Mar 09 13:20:02 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 09 13:20:03 crc restorecon[4702]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 13:20:03 crc restorecon[4702]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc 
restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:03 crc 
restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 
13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:03 crc 
restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:03 crc 
restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 
crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 
13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 13:20:03 crc 
restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc 
restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 13:20:03 crc restorecon[4702]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc 
restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 
crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc 
restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc 
restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc 
restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc 
restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc restorecon[4702]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 13:20:03 crc restorecon[4702]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 13:20:03 crc restorecon[4702]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 09 13:20:04 crc kubenswrapper[4703]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 09 13:20:04 crc kubenswrapper[4703]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 09 13:20:04 crc kubenswrapper[4703]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 09 13:20:04 crc kubenswrapper[4703]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 09 13:20:04 crc kubenswrapper[4703]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 09 13:20:04 crc kubenswrapper[4703]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.446908 4703 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456056 4703 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456092 4703 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456098 4703 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456105 4703 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456113 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456119 4703 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456124 4703 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456131 4703 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456137 4703 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456145 4703 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456150 4703 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456155 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456159 4703 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456164 4703 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456168 4703 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456174 4703 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456179 4703 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456184 4703 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456188 4703 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456192 4703 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456197 4703 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456203 4703 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456207 4703 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 
13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456212 4703 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456217 4703 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456221 4703 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456226 4703 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456230 4703 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456235 4703 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456240 4703 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456245 4703 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456249 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456254 4703 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456259 4703 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456264 4703 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456269 4703 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456274 4703 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456280 4703 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456284 4703 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456289 4703 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456294 4703 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456300 4703 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456305 4703 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456309 4703 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456314 4703 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456318 4703 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456323 4703 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456327 4703 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456332 4703 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456337 4703 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456341 4703 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456346 4703 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456350 4703 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456355 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456360 4703 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456364 4703 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456368 4703 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456373 4703 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456377 4703 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456383 4703 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456388 4703 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456396 4703 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456401 4703 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456406 4703 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456410 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456415 4703 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456419 4703 feature_gate.go:330] unrecognized feature gate: Example
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456425 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456429 4703 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456434 4703 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.456439 4703 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456555 4703 flags.go:64] FLAG: --address="0.0.0.0"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456568 4703 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456577 4703 flags.go:64] FLAG: --anonymous-auth="true"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456584 4703 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456592 4703 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456598 4703 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456605 4703 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456612 4703 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456618 4703 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456623 4703 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456629 4703 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456636 4703 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456642 4703 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456648 4703 flags.go:64] FLAG: --cgroup-root=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456653 4703 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456659 4703 flags.go:64] FLAG: --client-ca-file=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456664 4703 flags.go:64] FLAG: --cloud-config=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456670 4703 flags.go:64] FLAG: --cloud-provider=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456675 4703 flags.go:64] FLAG: --cluster-dns="[]"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456683 4703 flags.go:64] FLAG: --cluster-domain=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456688 4703 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456694 4703 flags.go:64] FLAG: --config-dir=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456699 4703 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456705 4703 flags.go:64] FLAG: --container-log-max-files="5"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456712 4703 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456717 4703 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456723 4703 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456728 4703 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456733 4703 flags.go:64] FLAG: --contention-profiling="false"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456740 4703 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456746 4703 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456751 4703 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456756 4703 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456766 4703 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456771 4703 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456776 4703 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456782 4703 flags.go:64] FLAG: --enable-load-reader="false"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456788 4703 flags.go:64] FLAG: --enable-server="true"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456794 4703 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456799 4703 flags.go:64] FLAG: --event-burst="100"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456805 4703 flags.go:64] FLAG: --event-qps="50"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456810 4703 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456815 4703 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456821 4703 flags.go:64] FLAG: --eviction-hard=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456827 4703 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456832 4703 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456837 4703 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456857 4703 flags.go:64] FLAG: --eviction-soft=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456863 4703 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456868 4703 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456873 4703 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456878 4703 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456883 4703 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456888 4703 flags.go:64] FLAG: --fail-swap-on="true"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456893 4703 flags.go:64] FLAG: --feature-gates=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456900 4703 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456905 4703 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456911 4703 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456916 4703 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456922 4703 flags.go:64] FLAG: --healthz-port="10248"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456927 4703 flags.go:64] FLAG: --help="false"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456933 4703 flags.go:64] FLAG: --hostname-override=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456938 4703 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456943 4703 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456949 4703 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456954 4703 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456959 4703 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456965 4703 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456970 4703 flags.go:64] FLAG: --image-service-endpoint=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456975 4703 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456981 4703 flags.go:64] FLAG: --kube-api-burst="100"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456987 4703 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456993 4703 flags.go:64] FLAG: --kube-api-qps="50"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.456999 4703 flags.go:64] FLAG: --kube-reserved=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457004 4703 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457009 4703 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457015 4703 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457020 4703 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457026 4703 flags.go:64] FLAG: --lock-file=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457031 4703 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457036 4703 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457041 4703 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457050 4703 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457055 4703 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457060 4703 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457065 4703 flags.go:64] FLAG: --logging-format="text"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457070 4703 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457076 4703 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457081 4703 flags.go:64] FLAG: --manifest-url=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457086 4703 flags.go:64] FLAG: --manifest-url-header=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457094 4703 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457100 4703 flags.go:64] FLAG: --max-open-files="1000000"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457106 4703 flags.go:64] FLAG: --max-pods="110"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457112 4703 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457117 4703 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457123 4703 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457131 4703 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457136 4703 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457142 4703 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457147 4703 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457161 4703 flags.go:64] FLAG: --node-status-max-images="50"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457166 4703 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457171 4703 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457178 4703 flags.go:64] FLAG: --pod-cidr=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457183 4703 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457191 4703 flags.go:64] FLAG: --pod-manifest-path=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457196 4703 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457201 4703 flags.go:64] FLAG: --pods-per-core="0"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457206 4703 flags.go:64] FLAG: --port="10250"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457218 4703 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457223 4703 flags.go:64] FLAG: --provider-id=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457228 4703 flags.go:64] FLAG: --qos-reserved=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457233 4703 flags.go:64] FLAG: --read-only-port="10255"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457238 4703 flags.go:64] FLAG: --register-node="true"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457243 4703 flags.go:64] FLAG: --register-schedulable="true"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457248 4703 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457257 4703 flags.go:64] FLAG: --registry-burst="10"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457262 4703 flags.go:64] FLAG: --registry-qps="5"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457267 4703 flags.go:64] FLAG: --reserved-cpus=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457272 4703 flags.go:64] FLAG: --reserved-memory=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457279 4703 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457284 4703 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457289 4703 flags.go:64] FLAG: --rotate-certificates="false"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457294 4703 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457300 4703 flags.go:64] FLAG: --runonce="false"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457306 4703 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457311 4703 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457316 4703 flags.go:64] FLAG: --seccomp-default="false"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457322 4703 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457327 4703 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457333 4703 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457338 4703 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457343 4703 flags.go:64] FLAG: --storage-driver-password="root"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457348 4703 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457354 4703 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457358 4703 flags.go:64] FLAG: --storage-driver-user="root"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457364 4703 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457369 4703 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457374 4703 flags.go:64] FLAG: --system-cgroups=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457378 4703 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457387 4703 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457392 4703 flags.go:64] FLAG: --tls-cert-file=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457397 4703 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457404 4703 flags.go:64] FLAG: --tls-min-version=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457409 4703 flags.go:64] FLAG: --tls-private-key-file=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457419 4703 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457424 4703 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457430 4703 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457436 4703 flags.go:64] FLAG: --v="2"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457443 4703 flags.go:64] FLAG: --version="false"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457450 4703 flags.go:64] FLAG: --vmodule=""
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457456 4703 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457462 4703 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457582 4703 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457590 4703 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457594 4703 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457598 4703 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457603 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457607 4703 feature_gate.go:330] unrecognized feature gate: Example
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457611 4703 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457616 4703 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457622 4703 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457627 4703 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457631 4703 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457636 4703 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457641 4703 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457646 4703 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457651 4703 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457656 4703 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457660 4703 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457664 4703 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457669 4703 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457673 4703 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457679 4703 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457684 4703 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457688 4703 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457693 4703 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457699 4703 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457704 4703 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457708 4703 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457713 4703 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457718 4703 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457724 4703 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457730 4703 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457734 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457739 4703 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457744 4703 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457749 4703 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457754 4703 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457759 4703 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457763 4703 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457767 4703 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457772 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457778 4703 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457784 4703 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457789 4703 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457794 4703 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457799 4703 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457804 4703 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457809 4703 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457814 4703 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457819 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457824 4703 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457829 4703 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457834 4703 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457838 4703 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457859 4703 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457865 4703 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457870 4703 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457875 4703 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457879 4703 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457883 4703 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457887 4703 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457891 4703 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457895 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457900 4703 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457906 4703 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457913 4703 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457920 4703 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457926 4703 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457932 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457938 4703 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457943 4703 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.457948 4703 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.457964 4703 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.474505 4703 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.474562 4703 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474696 4703 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474710 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474720 4703 feature_gate.go:330] unrecognized feature gate: Example
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474729 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474737 4703 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474745 4703 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474753 4703 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474763 4703 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474772 4703 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474780 4703 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474789 4703 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474797 4703 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474804 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474815 4703 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474825 4703 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474834 4703 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474882 4703 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474891 4703 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474900 4703 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474908 4703 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474916 4703 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474923 4703 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474932 4703 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474940 4703 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474948 4703 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474956 4703 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474964 4703 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474973 4703 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474982 4703 feature_gate.go:330] unrecognized
feature gate: NutanixMultiSubnets Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474990 4703 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.474997 4703 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475005 4703 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475013 4703 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475021 4703 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475031 4703 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475042 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475053 4703 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475063 4703 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475072 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475082 4703 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475092 4703 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475102 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475112 4703 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 09 13:20:04 crc 
kubenswrapper[4703]: W0309 13:20:04.475121 4703 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475131 4703 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475141 4703 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475151 4703 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475160 4703 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475169 4703 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475177 4703 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475185 4703 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475194 4703 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475201 4703 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475212 4703 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475225 4703 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475236 4703 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475245 4703 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475256 4703 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475266 4703 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475277 4703 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475287 4703 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475295 4703 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475304 4703 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475311 4703 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475319 4703 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475327 4703 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475335 4703 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475342 4703 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475350 4703 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 13:20:04 crc 
kubenswrapper[4703]: W0309 13:20:04.475358 4703 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475366 4703 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.475380 4703 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475646 4703 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475661 4703 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475670 4703 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475679 4703 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475688 4703 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475696 4703 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475704 4703 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475714 4703 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475724 4703 feature_gate.go:330] unrecognized feature 
gate: MachineAPIMigration Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475732 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475740 4703 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475748 4703 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475758 4703 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475770 4703 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475778 4703 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475787 4703 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475796 4703 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475804 4703 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475813 4703 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475821 4703 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475830 4703 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475838 4703 feature_gate.go:330] unrecognized feature gate: Example Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475875 4703 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 09 13:20:04 crc 
kubenswrapper[4703]: W0309 13:20:04.475883 4703 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475891 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475899 4703 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475907 4703 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475915 4703 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475923 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475934 4703 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475945 4703 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475953 4703 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475962 4703 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475969 4703 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475978 4703 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475987 4703 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.475995 4703 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476006 4703 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476014 4703 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476022 4703 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476035 4703 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476048 4703 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476058 4703 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476070 4703 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476080 4703 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476089 4703 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476097 4703 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476105 4703 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476112 4703 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476120 4703 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476128 4703 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476135 4703 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476143 4703 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476152 4703 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476160 4703 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476169 4703 feature_gate.go:330] 
unrecognized feature gate: InsightsOnDemandDataGather Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476177 4703 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476185 4703 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476193 4703 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476201 4703 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476209 4703 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476219 4703 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476226 4703 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476236 4703 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476245 4703 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476253 4703 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476261 4703 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476269 4703 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476277 4703 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476285 4703 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.476292 4703 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.476306 4703 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.478083 4703 server.go:940] "Client rotation is on, will bootstrap in background" Mar 09 13:20:04 crc kubenswrapper[4703]: E0309 13:20:04.483364 4703 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.486987 4703 bootstrap.go:101] "Use the bootstrap 
credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.487066 4703 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.489108 4703 server.go:997] "Starting client certificate rotation" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.489134 4703 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.489332 4703 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.518631 4703 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 09 13:20:04 crc kubenswrapper[4703]: E0309 13:20:04.520734 4703 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.19:6443: connect: connection refused" logger="UnhandledError" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.521921 4703 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.538499 4703 log.go:25] "Validated CRI v1 runtime API" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.583163 4703 log.go:25] "Validated CRI v1 image API" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.585618 4703 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.593766 4703 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-09-13-16-07-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.593812 4703 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.610469 4703 manager.go:217] Machine: {Timestamp:2026-03-09 13:20:04.60617847 +0000 UTC m=+0.573594166 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:d1c69174-e64f-4790-ab29-a0802c299c7e BootID:c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 
Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:0e:ef:68 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:0e:ef:68 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:69:c6:fe Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:b8:ff:9d Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:89:99:cd Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ae:79:b2 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:2e:83:8c:f4:c3:bd Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:06:49:c8:95:01:52 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 
Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.610668 4703 manager_no_libpfm.go:29] cAdvisor is build without cgo 
and/or libpfm support. Perf event counters are not available. Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.610786 4703 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.611066 4703 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.611239 4703 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.611270 4703 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":nul
l,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.611440 4703 topology_manager.go:138] "Creating topology manager with none policy" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.611449 4703 container_manager_linux.go:303] "Creating device plugin manager" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.612284 4703 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.612311 4703 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.612462 4703 state_mem.go:36] "Initialized new in-memory state store" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.612530 4703 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.618344 4703 kubelet.go:418] "Attempting to sync node with API server" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.618368 4703 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.618391 4703 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.618403 4703 kubelet.go:324] "Adding apiserver pod source" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.618412 4703 apiserver.go:42] "Waiting for node sync before watching apiserver pods" 
Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.621657 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.19:6443: connect: connection refused Mar 09 13:20:04 crc kubenswrapper[4703]: E0309 13:20:04.621818 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.19:6443: connect: connection refused" logger="UnhandledError" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.622880 4703 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.623099 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.19:6443: connect: connection refused Mar 09 13:20:04 crc kubenswrapper[4703]: E0309 13:20:04.623214 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.19:6443: connect: connection refused" logger="UnhandledError" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.624270 4703 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.626632 4703 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.628184 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.628208 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.628215 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.628222 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.628233 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.628240 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.628246 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.628257 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.628264 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.628272 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.628286 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.628293 4703 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.630425 4703 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.630791 4703 server.go:1280] "Started kubelet" Mar 09 13:20:04 crc systemd[1]: Started Kubernetes Kubelet. Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.632646 4703 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.633241 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.19:6443: connect: connection refused Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.633376 4703 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.633421 4703 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.634369 4703 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.634469 4703 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.634497 4703 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.634623 4703 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 09 13:20:04 crc kubenswrapper[4703]: E0309 13:20:04.634679 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.634759 4703 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.635259 4703 server.go:460] "Adding debug handlers to kubelet server" Mar 09 13:20:04 crc 
kubenswrapper[4703]: E0309 13:20:04.635027 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.19:6443: connect: connection refused" interval="200ms" Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.642384 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.19:6443: connect: connection refused Mar 09 13:20:04 crc kubenswrapper[4703]: E0309 13:20:04.642655 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.19:6443: connect: connection refused" logger="UnhandledError" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.643265 4703 factory.go:55] Registering systemd factory Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.644793 4703 factory.go:221] Registration of the systemd container factory successfully Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.656784 4703 factory.go:153] Registering CRI-O factory Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.656912 4703 factory.go:221] Registration of the crio container factory successfully Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.657044 4703 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.657099 4703 factory.go:103] Registering Raw factory Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 
13:20:04.657130 4703 manager.go:1196] Started watching for new ooms in manager Mar 09 13:20:04 crc kubenswrapper[4703]: E0309 13:20:04.656378 4703 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.19:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b2edce89dd4da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.630762714 +0000 UTC m=+0.598178400,LastTimestamp:2026-03-09 13:20:04.630762714 +0000 UTC m=+0.598178400,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.658927 4703 manager.go:319] Starting recovery of all containers Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.665682 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.665797 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.665830 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 
09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.665892 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.665920 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.665946 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.665972 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.665999 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666028 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666063 4703 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666090 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666116 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666142 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666175 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666201 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666231 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666256 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666281 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666306 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666335 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666400 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666429 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666456 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666484 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666509 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666535 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666569 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666597 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666623 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666647 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666673 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666735 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666760 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666784 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666808 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666838 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666897 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666928 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666953 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.666976 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667000 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667223 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667252 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667280 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667306 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667335 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667361 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667390 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667506 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667539 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667569 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667596 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667630 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667658 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667685 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667713 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667739 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667766 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667792 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667819 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667875 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.667920 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668181 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668214 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668244 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668274 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668302 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668329 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668356 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668387 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" 
seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668415 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668447 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668473 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668500 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668528 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668552 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 
13:20:04.668581 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668608 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668635 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668662 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668690 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668716 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668744 4703 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668773 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668803 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668830 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668949 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.668983 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.669010 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.669037 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.669064 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.669090 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.669118 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.669145 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.669175 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.669204 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.669230 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.669256 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.669281 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.669310 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.669338 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" 
seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.669368 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.671517 4703 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.671578 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.671609 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.671653 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.671687 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.671718 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.671749 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.671782 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.671813 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.671876 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.671913 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.671946 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.671977 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672005 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672031 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672058 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672085 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672114 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672142 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672169 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672197 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672223 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672251 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672281 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672307 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672332 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672358 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672385 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672412 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 09 
13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672437 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672463 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672491 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672517 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672547 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672574 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672598 4703 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672632 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672658 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672684 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672713 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672741 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672769 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672794 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672821 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672929 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672964 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.672989 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673016 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673040 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673070 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673095 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673120 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673147 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673171 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 09 13:20:04 crc 
kubenswrapper[4703]: I0309 13:20:04.673210 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673237 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673265 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673289 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673314 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673338 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673365 4703 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673390 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673418 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673442 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673468 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673496 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673529 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673555 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673588 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673618 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673645 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673675 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673703 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" 
seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673729 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673755 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673782 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673809 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673872 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673903 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673933 4703 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673958 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.673984 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674010 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674036 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674062 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674087 4703 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674111 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674135 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674160 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674186 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674212 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674239 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674265 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674292 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674318 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674344 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674368 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674401 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" 
seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674428 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674452 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674479 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674510 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674535 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674562 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674587 4703 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674613 4703 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674637 4703 reconstruct.go:97] "Volume reconstruction finished" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.674654 4703 reconciler.go:26] "Reconciler: start to sync state" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.694246 4703 manager.go:324] Recovery completed Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.699771 4703 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.705531 4703 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.705592 4703 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.705630 4703 kubelet.go:2335] "Starting kubelet main sync loop" Mar 09 13:20:04 crc kubenswrapper[4703]: E0309 13:20:04.705708 4703 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 09 13:20:04 crc kubenswrapper[4703]: W0309 13:20:04.706801 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.19:6443: connect: connection refused Mar 09 13:20:04 crc kubenswrapper[4703]: E0309 13:20:04.706918 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.19:6443: connect: connection refused" logger="UnhandledError" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.709661 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.711122 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.711167 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.711184 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.713270 4703 cpu_manager.go:225] 
"Starting CPU manager" policy="none" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.713306 4703 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.713342 4703 state_mem.go:36] "Initialized new in-memory state store" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.727100 4703 policy_none.go:49] "None policy: Start" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.728170 4703 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.728195 4703 state_mem.go:35] "Initializing new in-memory state store" Mar 09 13:20:04 crc kubenswrapper[4703]: E0309 13:20:04.735152 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.779335 4703 manager.go:334] "Starting Device Plugin manager" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.779382 4703 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.779416 4703 server.go:79] "Starting device plugin registration server" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.780659 4703 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.780699 4703 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.780920 4703 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.781101 4703 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.781136 4703 plugin_manager.go:118] "Starting Kubelet Plugin Manager" 
Mar 09 13:20:04 crc kubenswrapper[4703]: E0309 13:20:04.788291 4703 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.806559 4703 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.806689 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.807927 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.808005 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.808071 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.808269 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.808701 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.808774 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.809406 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.809433 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.809442 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.809555 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.809732 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.809823 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.810117 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.810181 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.810206 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.810353 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.810378 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.810390 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.810533 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.810668 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.810709 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.811091 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.811140 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.811173 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.811705 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.811735 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.811733 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.811773 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.811747 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.811799 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.812081 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:04 crc 
kubenswrapper[4703]: I0309 13:20:04.812193 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.812246 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.813193 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.813218 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.813227 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.813238 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.813258 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.813268 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.813377 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.813404 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.814050 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.814081 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.814094 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:04 crc kubenswrapper[4703]: E0309 13:20:04.842184 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.19:6443: connect: connection refused" interval="400ms" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.878239 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.878275 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.878296 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.878312 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.878328 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.878342 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.878355 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.878371 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.878387 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.878404 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.878456 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.878906 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.879022 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.879135 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.880069 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.881401 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.884061 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.884105 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.884123 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.884156 4703 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:20:04 crc kubenswrapper[4703]: E0309 13:20:04.884617 4703 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.19:6443: connect: connection refused" node="crc" Mar 09 13:20:04 crc 
kubenswrapper[4703]: I0309 13:20:04.982230 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982318 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982353 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982387 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982412 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982432 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982452 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982473 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982479 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982546 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982494 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:04 crc 
kubenswrapper[4703]: I0309 13:20:04.982596 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982611 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982627 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982653 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982680 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982688 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982712 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982719 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982742 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982748 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982777 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982805 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982658 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982868 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982901 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982931 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982959 4703 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982988 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:04 crc kubenswrapper[4703]: I0309 13:20:04.982775 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 13:20:05 crc kubenswrapper[4703]: I0309 13:20:05.085422 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:05 crc kubenswrapper[4703]: I0309 13:20:05.086934 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:05 crc kubenswrapper[4703]: I0309 13:20:05.086973 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:05 crc kubenswrapper[4703]: I0309 13:20:05.086985 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:05 crc kubenswrapper[4703]: I0309 13:20:05.087010 4703 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:20:05 crc kubenswrapper[4703]: E0309 13:20:05.087406 4703 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.19:6443: connect: connection 
refused" node="crc" Mar 09 13:20:05 crc kubenswrapper[4703]: I0309 13:20:05.139279 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:05 crc kubenswrapper[4703]: I0309 13:20:05.161361 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 13:20:05 crc kubenswrapper[4703]: I0309 13:20:05.167651 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 13:20:05 crc kubenswrapper[4703]: I0309 13:20:05.172691 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 09 13:20:05 crc kubenswrapper[4703]: W0309 13:20:05.184217 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-653ca923209b9644494c9efbf7467edda4a334375ee38d17aae105b8784950c0 WatchSource:0}: Error finding container 653ca923209b9644494c9efbf7467edda4a334375ee38d17aae105b8784950c0: Status 404 returned error can't find the container with id 653ca923209b9644494c9efbf7467edda4a334375ee38d17aae105b8784950c0 Mar 09 13:20:05 crc kubenswrapper[4703]: I0309 13:20:05.191415 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:05 crc kubenswrapper[4703]: W0309 13:20:05.205707 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-5ceb47041b1f429b42ed7a7ff410669ef0d058eec53b7d863c242a021a674a3b WatchSource:0}: Error finding container 5ceb47041b1f429b42ed7a7ff410669ef0d058eec53b7d863c242a021a674a3b: Status 404 returned error can't find the container with id 5ceb47041b1f429b42ed7a7ff410669ef0d058eec53b7d863c242a021a674a3b Mar 09 13:20:05 crc kubenswrapper[4703]: W0309 13:20:05.210292 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-c41c7299d0db2c289184994db318976df4377ef2253558045faf16ffca1d9d9e WatchSource:0}: Error finding container c41c7299d0db2c289184994db318976df4377ef2253558045faf16ffca1d9d9e: Status 404 returned error can't find the container with id c41c7299d0db2c289184994db318976df4377ef2253558045faf16ffca1d9d9e Mar 09 13:20:05 crc kubenswrapper[4703]: W0309 13:20:05.210737 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-ed1015b1fce80cd2d746c9b80d63880b56b367bc9fd23d8060ceaf870e3746cc WatchSource:0}: Error finding container ed1015b1fce80cd2d746c9b80d63880b56b367bc9fd23d8060ceaf870e3746cc: Status 404 returned error can't find the container with id ed1015b1fce80cd2d746c9b80d63880b56b367bc9fd23d8060ceaf870e3746cc Mar 09 13:20:05 crc kubenswrapper[4703]: W0309 13:20:05.221338 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-51427185409fa11c20013cef4ef572964d2b64ea3a29d7e190d5d13deffe9e5b 
WatchSource:0}: Error finding container 51427185409fa11c20013cef4ef572964d2b64ea3a29d7e190d5d13deffe9e5b: Status 404 returned error can't find the container with id 51427185409fa11c20013cef4ef572964d2b64ea3a29d7e190d5d13deffe9e5b Mar 09 13:20:05 crc kubenswrapper[4703]: E0309 13:20:05.243639 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.19:6443: connect: connection refused" interval="800ms" Mar 09 13:20:05 crc kubenswrapper[4703]: I0309 13:20:05.488408 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:05 crc kubenswrapper[4703]: I0309 13:20:05.490675 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:05 crc kubenswrapper[4703]: I0309 13:20:05.490736 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:05 crc kubenswrapper[4703]: I0309 13:20:05.490754 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:05 crc kubenswrapper[4703]: I0309 13:20:05.490789 4703 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:20:05 crc kubenswrapper[4703]: E0309 13:20:05.491474 4703 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.19:6443: connect: connection refused" node="crc" Mar 09 13:20:05 crc kubenswrapper[4703]: W0309 13:20:05.597163 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.19:6443: connect: connection refused Mar 09 13:20:05 crc 
kubenswrapper[4703]: E0309 13:20:05.597237 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.19:6443: connect: connection refused" logger="UnhandledError" Mar 09 13:20:05 crc kubenswrapper[4703]: I0309 13:20:05.634654 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.19:6443: connect: connection refused Mar 09 13:20:05 crc kubenswrapper[4703]: W0309 13:20:05.675610 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.19:6443: connect: connection refused Mar 09 13:20:05 crc kubenswrapper[4703]: E0309 13:20:05.675711 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.19:6443: connect: connection refused" logger="UnhandledError" Mar 09 13:20:05 crc kubenswrapper[4703]: I0309 13:20:05.712140 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ed1015b1fce80cd2d746c9b80d63880b56b367bc9fd23d8060ceaf870e3746cc"} Mar 09 13:20:05 crc kubenswrapper[4703]: I0309 13:20:05.713153 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c41c7299d0db2c289184994db318976df4377ef2253558045faf16ffca1d9d9e"} Mar 09 13:20:05 crc kubenswrapper[4703]: I0309 13:20:05.714551 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5ceb47041b1f429b42ed7a7ff410669ef0d058eec53b7d863c242a021a674a3b"} Mar 09 13:20:05 crc kubenswrapper[4703]: I0309 13:20:05.715723 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"653ca923209b9644494c9efbf7467edda4a334375ee38d17aae105b8784950c0"} Mar 09 13:20:05 crc kubenswrapper[4703]: I0309 13:20:05.716586 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"51427185409fa11c20013cef4ef572964d2b64ea3a29d7e190d5d13deffe9e5b"} Mar 09 13:20:05 crc kubenswrapper[4703]: W0309 13:20:05.718475 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.19:6443: connect: connection refused Mar 09 13:20:05 crc kubenswrapper[4703]: E0309 13:20:05.718554 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.19:6443: connect: connection refused" logger="UnhandledError" Mar 09 13:20:05 crc kubenswrapper[4703]: W0309 13:20:05.882151 4703 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.19:6443: connect: connection refused Mar 09 13:20:05 crc kubenswrapper[4703]: E0309 13:20:05.882220 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.19:6443: connect: connection refused" logger="UnhandledError" Mar 09 13:20:06 crc kubenswrapper[4703]: E0309 13:20:06.044520 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.19:6443: connect: connection refused" interval="1.6s" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.292564 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.294230 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.294298 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.294320 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.294365 4703 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:20:06 crc kubenswrapper[4703]: E0309 13:20:06.294974 4703 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.129.56.19:6443: connect: connection refused" node="crc" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.634747 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.19:6443: connect: connection refused Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.642885 4703 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 13:20:06 crc kubenswrapper[4703]: E0309 13:20:06.644014 4703 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.19:6443: connect: connection refused" logger="UnhandledError" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.724175 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0f1ee03239da8fa7cf5565ff3c6bba6a9d71db4cbcd64f08db0f0be0d8b6134a"} Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.724278 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7c1230ee4792811f205c69f3e4db77cf395897c92e08634a9cd77c5bb17c6e88"} Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.724309 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3a105bcef254c34ab707fc8dbc94df273ebda59e4d376c4598b2c7ba47458e6c"} Mar 09 13:20:06 
crc kubenswrapper[4703]: I0309 13:20:06.724327 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3e3485dc1ed7496fc861abc4578735829d38eb0dedae3210718362c42783bb16"} Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.724234 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.725273 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.725427 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.725580 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.726713 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.726754 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54"} Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.726545 4703 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54" exitCode=0 Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.727885 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.727931 4703 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.727949 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.730066 4703 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a" exitCode=0 Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.730160 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.730117 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a"} Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.730170 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.731380 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.731407 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.731419 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.732284 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.732302 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.732313 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.732477 4703 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc" exitCode=0 Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.732744 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.733315 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc"} Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.734977 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.735027 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.735046 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.735695 4703 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec" exitCode=0 Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.735757 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec"} Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.736176 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.737429 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.737478 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:06 crc kubenswrapper[4703]: I0309 13:20:06.737500 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.634119 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.19:6443: connect: connection refused Mar 09 13:20:07 crc kubenswrapper[4703]: E0309 13:20:07.645782 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.19:6443: connect: connection refused" interval="3.2s" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.741116 4703 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3" exitCode=0 Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.741192 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3"} Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.741289 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.742227 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.742270 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.742285 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.743135 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.743124 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02"} Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.744101 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.744137 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.744149 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.746956 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1"} Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.746976 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156"} Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.746986 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0"} Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.746996 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734"} Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.749179 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c5de22dff9567e8ca8196c8e42d84edee93c6034417605502b8af248bf17d1f7"} Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.749218 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"520d7468346bb67f0dc9bfd9c50a059a61cb96f7e43fa87dc52885b3c648a414"} Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.749230 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"518722d55747b0df3c6bdb05215d5c8439cef0e2d505a5effb510465273eb0dc"} Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.749231 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.749242 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:07 crc kubenswrapper[4703]: W0309 13:20:07.749774 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.19:6443: connect: connection refused Mar 09 13:20:07 crc kubenswrapper[4703]: E0309 13:20:07.749933 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.19:6443: connect: connection refused" logger="UnhandledError" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.750268 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.750302 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.750316 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.750272 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.750422 4703 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.750442 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:07 crc kubenswrapper[4703]: W0309 13:20:07.875099 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.19:6443: connect: connection refused Mar 09 13:20:07 crc kubenswrapper[4703]: E0309 13:20:07.875161 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.19:6443: connect: connection refused" logger="UnhandledError" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.896147 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.897553 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.897588 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.897601 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:07 crc kubenswrapper[4703]: I0309 13:20:07.897631 4703 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:20:07 crc kubenswrapper[4703]: E0309 13:20:07.898199 4703 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.19:6443: connect: connection refused" node="crc" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.133712 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.744534 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.755147 4703 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c" exitCode=0 Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.755249 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c"} Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.755469 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.756803 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.756908 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.756942 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.770757 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.771571 4703 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.772253 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9d050a396d453d7ab16fd9bed01e9e0c6116d186a576d77b421b4d7377070137"} Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.772402 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.773027 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.774369 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.774424 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.774447 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.775486 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.775524 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.775544 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.776339 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.776384 4703 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.776406 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.777300 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.777339 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:08 crc kubenswrapper[4703]: I0309 13:20:08.777355 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:09 crc kubenswrapper[4703]: I0309 13:20:09.534280 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:09 crc kubenswrapper[4703]: I0309 13:20:09.778070 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b"} Mar 09 13:20:09 crc kubenswrapper[4703]: I0309 13:20:09.778125 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb"} Mar 09 13:20:09 crc kubenswrapper[4703]: I0309 13:20:09.778147 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574"} Mar 09 13:20:09 crc kubenswrapper[4703]: I0309 13:20:09.778164 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394"} Mar 09 13:20:09 crc kubenswrapper[4703]: I0309 13:20:09.778172 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:09 crc kubenswrapper[4703]: I0309 13:20:09.778179 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec"} Mar 09 13:20:09 crc kubenswrapper[4703]: I0309 13:20:09.778238 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:09 crc kubenswrapper[4703]: I0309 13:20:09.778265 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:09 crc kubenswrapper[4703]: I0309 13:20:09.779287 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:09 crc kubenswrapper[4703]: I0309 13:20:09.779347 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:09 crc kubenswrapper[4703]: I0309 13:20:09.779373 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:09 crc kubenswrapper[4703]: I0309 13:20:09.779884 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:09 crc kubenswrapper[4703]: I0309 13:20:09.779927 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:09 crc kubenswrapper[4703]: I0309 13:20:09.779939 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 09 13:20:09 crc kubenswrapper[4703]: I0309 13:20:09.811430 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:10 crc kubenswrapper[4703]: I0309 13:20:10.746592 4703 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 13:20:10 crc kubenswrapper[4703]: I0309 13:20:10.780319 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:10 crc kubenswrapper[4703]: I0309 13:20:10.780383 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:10 crc kubenswrapper[4703]: I0309 13:20:10.782067 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:10 crc kubenswrapper[4703]: I0309 13:20:10.782099 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:10 crc kubenswrapper[4703]: I0309 13:20:10.782107 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:10 crc kubenswrapper[4703]: I0309 13:20:10.782194 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:10 crc kubenswrapper[4703]: I0309 13:20:10.782222 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:10 crc kubenswrapper[4703]: I0309 13:20:10.782236 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:11 crc kubenswrapper[4703]: I0309 13:20:11.098900 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:11 crc kubenswrapper[4703]: I0309 13:20:11.099986 4703 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:11 crc kubenswrapper[4703]: I0309 13:20:11.100051 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:11 crc kubenswrapper[4703]: I0309 13:20:11.100063 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:11 crc kubenswrapper[4703]: I0309 13:20:11.100089 4703 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:20:11 crc kubenswrapper[4703]: I0309 13:20:11.555357 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 09 13:20:11 crc kubenswrapper[4703]: I0309 13:20:11.782613 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:11 crc kubenswrapper[4703]: I0309 13:20:11.782689 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:11 crc kubenswrapper[4703]: I0309 13:20:11.784396 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:11 crc kubenswrapper[4703]: I0309 13:20:11.784448 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:11 crc kubenswrapper[4703]: I0309 13:20:11.784467 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:11 crc kubenswrapper[4703]: I0309 13:20:11.784904 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:11 crc kubenswrapper[4703]: I0309 13:20:11.785069 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:11 crc kubenswrapper[4703]: I0309 
13:20:11.785203 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:11 crc kubenswrapper[4703]: I0309 13:20:11.969954 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:11 crc kubenswrapper[4703]: I0309 13:20:11.970117 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:11 crc kubenswrapper[4703]: I0309 13:20:11.971465 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:11 crc kubenswrapper[4703]: I0309 13:20:11.971496 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:11 crc kubenswrapper[4703]: I0309 13:20:11.971507 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:12 crc kubenswrapper[4703]: I0309 13:20:12.346035 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 09 13:20:12 crc kubenswrapper[4703]: I0309 13:20:12.785229 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:12 crc kubenswrapper[4703]: I0309 13:20:12.786221 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:12 crc kubenswrapper[4703]: I0309 13:20:12.786253 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:12 crc kubenswrapper[4703]: I0309 13:20:12.786264 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:13 crc kubenswrapper[4703]: I0309 13:20:13.788729 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 09 13:20:13 crc kubenswrapper[4703]: I0309 13:20:13.790115 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:13 crc kubenswrapper[4703]: I0309 13:20:13.790167 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:13 crc kubenswrapper[4703]: I0309 13:20:13.790178 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:14 crc kubenswrapper[4703]: I0309 13:20:14.062389 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:14 crc kubenswrapper[4703]: I0309 13:20:14.062626 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:14 crc kubenswrapper[4703]: I0309 13:20:14.065509 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:14 crc kubenswrapper[4703]: I0309 13:20:14.065577 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:14 crc kubenswrapper[4703]: I0309 13:20:14.065658 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:14 crc kubenswrapper[4703]: I0309 13:20:14.071084 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:14 crc kubenswrapper[4703]: I0309 13:20:14.135405 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:14 crc kubenswrapper[4703]: E0309 13:20:14.788652 4703 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to 
get node info: node \"crc\" not found" Mar 09 13:20:14 crc kubenswrapper[4703]: I0309 13:20:14.790959 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:14 crc kubenswrapper[4703]: I0309 13:20:14.792370 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:14 crc kubenswrapper[4703]: I0309 13:20:14.792429 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:14 crc kubenswrapper[4703]: I0309 13:20:14.792562 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:15 crc kubenswrapper[4703]: I0309 13:20:15.793545 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:15 crc kubenswrapper[4703]: I0309 13:20:15.794613 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:15 crc kubenswrapper[4703]: I0309 13:20:15.794672 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:15 crc kubenswrapper[4703]: I0309 13:20:15.794697 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:15 crc kubenswrapper[4703]: I0309 13:20:15.800435 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:16 crc kubenswrapper[4703]: I0309 13:20:16.795833 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:16 crc kubenswrapper[4703]: I0309 13:20:16.796959 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:16 crc kubenswrapper[4703]: I0309 
13:20:16.797011 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:16 crc kubenswrapper[4703]: I0309 13:20:16.797022 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:17 crc kubenswrapper[4703]: I0309 13:20:17.136165 4703 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 13:20:17 crc kubenswrapper[4703]: I0309 13:20:17.136230 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 13:20:18 crc kubenswrapper[4703]: W0309 13:20:18.635255 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 09 13:20:18 crc kubenswrapper[4703]: I0309 13:20:18.635390 4703 trace.go:236] Trace[1082532631]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Mar-2026 13:20:08.633) (total time: 10001ms): Mar 09 13:20:18 crc kubenswrapper[4703]: Trace[1082532631]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:20:18.635) Mar 09 13:20:18 crc kubenswrapper[4703]: Trace[1082532631]: [10.001452561s] [10.001452561s] END Mar 09 13:20:18 crc 
kubenswrapper[4703]: E0309 13:20:18.635425 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 09 13:20:19 crc kubenswrapper[4703]: W0309 13:20:19.427412 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 09 13:20:19 crc kubenswrapper[4703]: I0309 13:20:19.427513 4703 trace.go:236] Trace[1626047222]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Mar-2026 13:20:08.652) (total time: 10774ms): Mar 09 13:20:19 crc kubenswrapper[4703]: Trace[1626047222]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10774ms (13:20:19.427) Mar 09 13:20:19 crc kubenswrapper[4703]: Trace[1626047222]: [10.774578181s] [10.774578181s] END Mar 09 13:20:19 crc kubenswrapper[4703]: E0309 13:20:19.427542 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 09 13:20:19 crc kubenswrapper[4703]: I0309 13:20:19.427509 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 09 13:20:19 crc kubenswrapper[4703]: I0309 13:20:19.432190 4703 patch_prober.go:28] interesting 
pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54872->192.168.126.11:17697: read: connection reset by peer" start-of-body= Mar 09 13:20:19 crc kubenswrapper[4703]: I0309 13:20:19.432254 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54872->192.168.126.11:17697: read: connection reset by peer" Mar 09 13:20:19 crc kubenswrapper[4703]: I0309 13:20:19.811854 4703 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 13:20:19 crc kubenswrapper[4703]: I0309 13:20:19.811927 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 13:20:19 crc kubenswrapper[4703]: I0309 13:20:19.839980 4703 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 09 13:20:19 crc kubenswrapper[4703]: I0309 13:20:19.840043 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 09 13:20:20 crc kubenswrapper[4703]: I0309 13:20:20.225556 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 09 13:20:20 crc kubenswrapper[4703]: I0309 13:20:20.228400 4703 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9d050a396d453d7ab16fd9bed01e9e0c6116d186a576d77b421b4d7377070137" exitCode=255 Mar 09 13:20:20 crc kubenswrapper[4703]: I0309 13:20:20.228463 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9d050a396d453d7ab16fd9bed01e9e0c6116d186a576d77b421b4d7377070137"} Mar 09 13:20:20 crc kubenswrapper[4703]: I0309 13:20:20.228669 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:20 crc kubenswrapper[4703]: I0309 13:20:20.230065 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:20 crc kubenswrapper[4703]: I0309 13:20:20.230117 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:20 crc kubenswrapper[4703]: I0309 13:20:20.230139 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:20 crc kubenswrapper[4703]: I0309 13:20:20.231042 4703 scope.go:117] "RemoveContainer" containerID="9d050a396d453d7ab16fd9bed01e9e0c6116d186a576d77b421b4d7377070137" Mar 09 13:20:20 crc kubenswrapper[4703]: E0309 13:20:20.250469 4703 certificate_manager.go:562] 
"Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:20:20 crc kubenswrapper[4703]: E0309 13:20:20.257666 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:20Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 09 13:20:20 crc kubenswrapper[4703]: E0309 13:20:20.260409 4703 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:20Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 13:20:20 crc kubenswrapper[4703]: E0309 13:20:20.265663 4703 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:20Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b2edce89dd4da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.630762714 +0000 UTC 
m=+0.598178400,LastTimestamp:2026-03-09 13:20:04.630762714 +0000 UTC m=+0.598178400,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:20 crc kubenswrapper[4703]: I0309 13:20:20.265973 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:20Z is after 2026-02-23T05:33:13Z Mar 09 13:20:20 crc kubenswrapper[4703]: W0309 13:20:20.270094 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:20Z is after 2026-02-23T05:33:13Z Mar 09 13:20:20 crc kubenswrapper[4703]: E0309 13:20:20.270204 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:20:20 crc kubenswrapper[4703]: W0309 13:20:20.270508 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:20Z is after 2026-02-23T05:33:13Z Mar 09 13:20:20 crc 
kubenswrapper[4703]: E0309 13:20:20.270589 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:20:20 crc kubenswrapper[4703]: I0309 13:20:20.270620 4703 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 09 13:20:20 crc kubenswrapper[4703]: I0309 13:20:20.270672 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 09 13:20:20 crc kubenswrapper[4703]: I0309 13:20:20.637350 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:20Z is after 2026-02-23T05:33:13Z Mar 09 13:20:21 crc kubenswrapper[4703]: I0309 13:20:21.232583 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 09 13:20:21 crc kubenswrapper[4703]: I0309 13:20:21.233972 4703 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"62f0c4102f9c087bd76a8656e204eaac378a2881f4ac903da0dd6e748a48979d"} Mar 09 13:20:21 crc kubenswrapper[4703]: I0309 13:20:21.234106 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:21 crc kubenswrapper[4703]: I0309 13:20:21.234770 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:21 crc kubenswrapper[4703]: I0309 13:20:21.234801 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:21 crc kubenswrapper[4703]: I0309 13:20:21.234809 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:21 crc kubenswrapper[4703]: I0309 13:20:21.638648 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:21Z is after 2026-02-23T05:33:13Z Mar 09 13:20:22 crc kubenswrapper[4703]: I0309 13:20:22.238278 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 13:20:22 crc kubenswrapper[4703]: I0309 13:20:22.239039 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 09 13:20:22 crc kubenswrapper[4703]: I0309 13:20:22.241553 4703 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="62f0c4102f9c087bd76a8656e204eaac378a2881f4ac903da0dd6e748a48979d" exitCode=255 Mar 09 13:20:22 crc kubenswrapper[4703]: I0309 13:20:22.241620 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"62f0c4102f9c087bd76a8656e204eaac378a2881f4ac903da0dd6e748a48979d"} Mar 09 13:20:22 crc kubenswrapper[4703]: I0309 13:20:22.241701 4703 scope.go:117] "RemoveContainer" containerID="9d050a396d453d7ab16fd9bed01e9e0c6116d186a576d77b421b4d7377070137" Mar 09 13:20:22 crc kubenswrapper[4703]: I0309 13:20:22.241875 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:22 crc kubenswrapper[4703]: I0309 13:20:22.243049 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:22 crc kubenswrapper[4703]: I0309 13:20:22.243100 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:22 crc kubenswrapper[4703]: I0309 13:20:22.243113 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:22 crc kubenswrapper[4703]: I0309 13:20:22.243709 4703 scope.go:117] "RemoveContainer" containerID="62f0c4102f9c087bd76a8656e204eaac378a2881f4ac903da0dd6e748a48979d" Mar 09 13:20:22 crc kubenswrapper[4703]: E0309 13:20:22.243886 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:20:22 crc kubenswrapper[4703]: I0309 13:20:22.386885 4703 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 09 13:20:22 crc kubenswrapper[4703]: I0309 13:20:22.387291 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:22 crc kubenswrapper[4703]: I0309 13:20:22.389501 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:22 crc kubenswrapper[4703]: I0309 13:20:22.389569 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:22 crc kubenswrapper[4703]: I0309 13:20:22.389632 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:22 crc kubenswrapper[4703]: I0309 13:20:22.403804 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 09 13:20:22 crc kubenswrapper[4703]: I0309 13:20:22.639648 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:22Z is after 2026-02-23T05:33:13Z Mar 09 13:20:23 crc kubenswrapper[4703]: I0309 13:20:23.246571 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 13:20:23 crc kubenswrapper[4703]: I0309 13:20:23.248911 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:23 crc kubenswrapper[4703]: I0309 13:20:23.249898 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:23 crc kubenswrapper[4703]: I0309 13:20:23.249944 4703 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:23 crc kubenswrapper[4703]: I0309 13:20:23.249960 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:23 crc kubenswrapper[4703]: I0309 13:20:23.638100 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:23Z is after 2026-02-23T05:33:13Z Mar 09 13:20:24 crc kubenswrapper[4703]: I0309 13:20:24.639700 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:24Z is after 2026-02-23T05:33:13Z Mar 09 13:20:24 crc kubenswrapper[4703]: W0309 13:20:24.751464 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:24Z is after 2026-02-23T05:33:13Z Mar 09 13:20:24 crc kubenswrapper[4703]: E0309 13:20:24.751537 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:20:24 
crc kubenswrapper[4703]: E0309 13:20:24.788807 4703 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:20:24 crc kubenswrapper[4703]: I0309 13:20:24.820925 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:24 crc kubenswrapper[4703]: I0309 13:20:24.821121 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:24 crc kubenswrapper[4703]: I0309 13:20:24.822893 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:24 crc kubenswrapper[4703]: I0309 13:20:24.822953 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:24 crc kubenswrapper[4703]: I0309 13:20:24.822973 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:24 crc kubenswrapper[4703]: I0309 13:20:24.823724 4703 scope.go:117] "RemoveContainer" containerID="62f0c4102f9c087bd76a8656e204eaac378a2881f4ac903da0dd6e748a48979d" Mar 09 13:20:24 crc kubenswrapper[4703]: E0309 13:20:24.824052 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:20:24 crc kubenswrapper[4703]: I0309 13:20:24.828057 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:24 crc kubenswrapper[4703]: W0309 13:20:24.840972 4703 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:24Z is after 2026-02-23T05:33:13Z Mar 09 13:20:24 crc kubenswrapper[4703]: E0309 13:20:24.841062 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 13:20:25 crc kubenswrapper[4703]: I0309 13:20:25.253975 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:25 crc kubenswrapper[4703]: I0309 13:20:25.255039 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:25 crc kubenswrapper[4703]: I0309 13:20:25.255100 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:25 crc kubenswrapper[4703]: I0309 13:20:25.255117 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:25 crc kubenswrapper[4703]: I0309 13:20:25.255962 4703 scope.go:117] "RemoveContainer" containerID="62f0c4102f9c087bd76a8656e204eaac378a2881f4ac903da0dd6e748a48979d" Mar 09 13:20:25 crc kubenswrapper[4703]: E0309 13:20:25.256196 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:20:25 crc kubenswrapper[4703]: I0309 13:20:25.638007 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:25Z is after 2026-02-23T05:33:13Z Mar 09 13:20:26 crc kubenswrapper[4703]: I0309 13:20:26.638993 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:20:26Z is after 2026-02-23T05:33:13Z Mar 09 13:20:26 crc kubenswrapper[4703]: I0309 13:20:26.661212 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:26 crc kubenswrapper[4703]: I0309 13:20:26.662346 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:26 crc kubenswrapper[4703]: I0309 13:20:26.662381 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:26 crc kubenswrapper[4703]: I0309 13:20:26.662393 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:26 crc kubenswrapper[4703]: I0309 13:20:26.662416 4703 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:20:26 crc kubenswrapper[4703]: E0309 13:20:26.666388 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" 
cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 13:20:26 crc kubenswrapper[4703]: E0309 13:20:26.666563 4703 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 13:20:27 crc kubenswrapper[4703]: I0309 13:20:27.136226 4703 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 13:20:27 crc kubenswrapper[4703]: I0309 13:20:27.136324 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 13:20:27 crc kubenswrapper[4703]: I0309 13:20:27.642344 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:28 crc kubenswrapper[4703]: I0309 13:20:28.560010 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:28 crc kubenswrapper[4703]: I0309 13:20:28.560409 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:28 crc kubenswrapper[4703]: I0309 13:20:28.561638 4703 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:28 crc kubenswrapper[4703]: I0309 13:20:28.561744 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:28 crc kubenswrapper[4703]: I0309 13:20:28.561718 4703 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 13:20:28 crc kubenswrapper[4703]: I0309 13:20:28.561821 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:28 crc kubenswrapper[4703]: I0309 13:20:28.562735 4703 scope.go:117] "RemoveContainer" containerID="62f0c4102f9c087bd76a8656e204eaac378a2881f4ac903da0dd6e748a48979d" Mar 09 13:20:28 crc kubenswrapper[4703]: E0309 13:20:28.563010 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:20:28 crc kubenswrapper[4703]: I0309 13:20:28.582205 4703 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 09 13:20:28 crc kubenswrapper[4703]: I0309 13:20:28.642327 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:28 crc kubenswrapper[4703]: W0309 13:20:28.870313 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API 
group "" at the cluster scope Mar 09 13:20:28 crc kubenswrapper[4703]: E0309 13:20:28.870385 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 09 13:20:29 crc kubenswrapper[4703]: I0309 13:20:29.640042 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:29 crc kubenswrapper[4703]: I0309 13:20:29.840485 4703 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:29 crc kubenswrapper[4703]: I0309 13:20:29.840693 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:29 crc kubenswrapper[4703]: I0309 13:20:29.842203 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:29 crc kubenswrapper[4703]: I0309 13:20:29.842271 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:29 crc kubenswrapper[4703]: I0309 13:20:29.842296 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:29 crc kubenswrapper[4703]: I0309 13:20:29.843112 4703 scope.go:117] "RemoveContainer" containerID="62f0c4102f9c087bd76a8656e204eaac378a2881f4ac903da0dd6e748a48979d" Mar 09 13:20:29 crc kubenswrapper[4703]: E0309 13:20:29.843390 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.273205 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edce89dd4da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.630762714 +0000 UTC m=+0.598178400,LastTimestamp:2026-03-09 13:20:04.630762714 +0000 UTC m=+0.598178400,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.280214 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced688511 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711154961 +0000 UTC m=+0.678570667,LastTimestamp:2026-03-09 13:20:04.711154961 +0000 UTC m=+0.678570667,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc 
kubenswrapper[4703]: E0309 13:20:30.286037 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced68de6f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711177839 +0000 UTC m=+0.678593535,LastTimestamp:2026-03-09 13:20:04.711177839 +0000 UTC m=+0.678593535,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.289888 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced691384 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711191428 +0000 UTC m=+0.678607124,LastTimestamp:2026-03-09 13:20:04.711191428 +0000 UTC m=+0.678607124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.294167 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edcf1bc4ca5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.783754405 +0000 UTC m=+0.751170101,LastTimestamp:2026-03-09 13:20:04.783754405 +0000 UTC m=+0.751170101,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.298591 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2edced688511\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced688511 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711154961 +0000 UTC m=+0.678570667,LastTimestamp:2026-03-09 13:20:04.80798471 +0000 UTC m=+0.775400446,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.303052 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2edced68de6f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced68de6f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711177839 +0000 UTC m=+0.678593535,LastTimestamp:2026-03-09 13:20:04.808017797 +0000 UTC m=+0.775433493,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.308934 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2edced691384\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced691384 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711191428 +0000 UTC m=+0.678607124,LastTimestamp:2026-03-09 13:20:04.808080622 +0000 UTC m=+0.775496318,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.313527 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2edced688511\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced688511 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711154961 +0000 UTC m=+0.678570667,LastTimestamp:2026-03-09 13:20:04.809425595 +0000 UTC m=+0.776841281,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.319612 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2edced68de6f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced68de6f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711177839 +0000 UTC m=+0.678593535,LastTimestamp:2026-03-09 13:20:04.809438804 +0000 UTC m=+0.776854490,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.325684 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2edced691384\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced691384 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711191428 +0000 UTC 
m=+0.678607124,LastTimestamp:2026-03-09 13:20:04.809447233 +0000 UTC m=+0.776862919,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.331606 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2edced688511\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced688511 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711154961 +0000 UTC m=+0.678570667,LastTimestamp:2026-03-09 13:20:04.810157502 +0000 UTC m=+0.777573228,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.338486 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2edced68de6f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced68de6f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711177839 +0000 UTC m=+0.678593535,LastTimestamp:2026-03-09 13:20:04.810197698 +0000 UTC m=+0.777613424,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.344838 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2edced691384\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced691384 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711191428 +0000 UTC m=+0.678607124,LastTimestamp:2026-03-09 13:20:04.810219506 +0000 UTC m=+0.777635242,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.351747 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2edced688511\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced688511 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711154961 +0000 UTC m=+0.678570667,LastTimestamp:2026-03-09 13:20:04.810372033 +0000 UTC m=+0.777787729,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.358159 4703 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2edced68de6f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced68de6f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711177839 +0000 UTC m=+0.678593535,LastTimestamp:2026-03-09 13:20:04.810385852 +0000 UTC m=+0.777801548,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.364600 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2edced691384\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced691384 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711191428 +0000 UTC m=+0.678607124,LastTimestamp:2026-03-09 13:20:04.810417909 +0000 UTC m=+0.777833605,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.371509 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2edced688511\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced688511 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711154961 +0000 UTC m=+0.678570667,LastTimestamp:2026-03-09 13:20:04.811127397 +0000 UTC m=+0.778543123,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.380661 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2edced68de6f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced68de6f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711177839 +0000 UTC m=+0.678593535,LastTimestamp:2026-03-09 13:20:04.811155435 +0000 UTC m=+0.778571171,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.386795 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2edced691384\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced691384 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711191428 +0000 UTC m=+0.678607124,LastTimestamp:2026-03-09 13:20:04.811186882 +0000 UTC m=+0.778602618,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.394006 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2edced688511\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced688511 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711154961 +0000 UTC m=+0.678570667,LastTimestamp:2026-03-09 13:20:04.811729335 +0000 UTC m=+0.779145031,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.399069 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2edced68de6f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced68de6f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711177839 +0000 UTC m=+0.678593535,LastTimestamp:2026-03-09 13:20:04.811742194 +0000 UTC m=+0.779157890,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.403472 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2edced688511\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced688511 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711154961 +0000 UTC m=+0.678570667,LastTimestamp:2026-03-09 13:20:04.811760532 +0000 UTC m=+0.779176268,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.410269 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2edced68de6f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced68de6f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711177839 +0000 UTC 
m=+0.678593535,LastTimestamp:2026-03-09 13:20:04.81178792 +0000 UTC m=+0.779203656,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.416165 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b2edced691384\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b2edced691384 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:04.711191428 +0000 UTC m=+0.678607124,LastTimestamp:2026-03-09 13:20:04.811802789 +0000 UTC m=+0.779218475,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.423160 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2edd0a513877 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present 
on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:05.196167287 +0000 UTC m=+1.163583013,LastTimestamp:2026-03-09 13:20:05.196167287 +0000 UTC m=+1.163583013,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.429068 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2edd0b2262c2 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:05.209875138 +0000 UTC m=+1.177290844,LastTimestamp:2026-03-09 13:20:05.209875138 +0000 UTC m=+1.177290844,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.435999 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2edd0b4a2522 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:05.212480802 +0000 UTC m=+1.179896488,LastTimestamp:2026-03-09 13:20:05.212480802 +0000 UTC m=+1.179896488,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.441810 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2edd0b536c90 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:05.213088912 +0000 UTC m=+1.180504628,LastTimestamp:2026-03-09 13:20:05.213088912 +0000 UTC m=+1.180504628,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.445777 4703 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2edd0c2cee82 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:05.22734349 +0000 UTC m=+1.194759186,LastTimestamp:2026-03-09 13:20:05.22734349 +0000 UTC m=+1.194759186,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.450233 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2edd2cae453a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:05.772690746 +0000 UTC m=+1.740106432,LastTimestamp:2026-03-09 13:20:05.772690746 +0000 UTC m=+1.740106432,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.454418 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2edd2cb19a11 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:05.772909073 +0000 UTC m=+1.740324749,LastTimestamp:2026-03-09 13:20:05.772909073 +0000 UTC m=+1.740324749,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.458885 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2edd2cb33856 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:05.773015126 +0000 UTC 
m=+1.740430812,LastTimestamp:2026-03-09 13:20:05.773015126 +0000 UTC m=+1.740430812,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.462943 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2edd2cb719a7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:05.773269415 +0000 UTC m=+1.740685101,LastTimestamp:2026-03-09 13:20:05.773269415 +0000 UTC m=+1.740685101,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.469204 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2edd2ce0c421 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:05.776000033 +0000 UTC m=+1.743415719,LastTimestamp:2026-03-09 
13:20:05.776000033 +0000 UTC m=+1.743415719,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.474655 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2edd2d98b4fc openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:05.78805478 +0000 UTC m=+1.755470466,LastTimestamp:2026-03-09 13:20:05.78805478 +0000 UTC m=+1.755470466,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.480772 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2edd2db06b06 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:05.78960871 +0000 UTC m=+1.757024396,LastTimestamp:2026-03-09 
13:20:05.78960871 +0000 UTC m=+1.757024396,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.486904 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2edd2db122aa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:05.789655722 +0000 UTC m=+1.757071408,LastTimestamp:2026-03-09 13:20:05.789655722 +0000 UTC m=+1.757071408,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.492560 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2edd2db914a3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:05.790176419 +0000 UTC 
m=+1.757592105,LastTimestamp:2026-03-09 13:20:05.790176419 +0000 UTC m=+1.757592105,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.498529 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2edd2db9cc32 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:05.79022341 +0000 UTC m=+1.757639096,LastTimestamp:2026-03-09 13:20:05.79022341 +0000 UTC m=+1.757639096,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.504440 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2edd2dc93e6d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:05.791235693 +0000 UTC m=+1.758651379,LastTimestamp:2026-03-09 13:20:05.791235693 +0000 UTC m=+1.758651379,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.510708 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2edd3f7d613f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:06.088253759 +0000 UTC m=+2.055669435,LastTimestamp:2026-03-09 13:20:06.088253759 +0000 UTC m=+2.055669435,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.521696 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2edd403ca67e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:06.100788862 +0000 UTC m=+2.068204588,LastTimestamp:2026-03-09 13:20:06.100788862 +0000 UTC m=+2.068204588,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.526443 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2edd40549a23 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:06.102358563 +0000 UTC m=+2.069774289,LastTimestamp:2026-03-09 13:20:06.102358563 +0000 
UTC m=+2.069774289,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.530997 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2edd4bb0bf9c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:06.292946844 +0000 UTC m=+2.260362560,LastTimestamp:2026-03-09 13:20:06.292946844 +0000 UTC m=+2.260362560,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.536959 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2edd4c5e4ed8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container 
kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:06.30432124 +0000 UTC m=+2.271736966,LastTimestamp:2026-03-09 13:20:06.30432124 +0000 UTC m=+2.271736966,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.540599 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2edd4c6c80e8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:06.30525156 +0000 UTC m=+2.272667236,LastTimestamp:2026-03-09 13:20:06.30525156 +0000 UTC m=+2.272667236,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.544325 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2edd56c2e352 openshift-kube-controller-manager 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:06.47868501 +0000 UTC m=+2.446100736,LastTimestamp:2026-03-09 13:20:06.47868501 +0000 UTC m=+2.446100736,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.548079 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2edd57aba2cd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:06.493938381 +0000 UTC m=+2.461354067,LastTimestamp:2026-03-09 13:20:06.493938381 +0000 UTC m=+2.461354067,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.554796 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2edd65bbf648 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:06.729889352 +0000 UTC m=+2.697305078,LastTimestamp:2026-03-09 13:20:06.729889352 +0000 UTC m=+2.697305078,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.561470 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2edd65e6bcd4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:06.732692692 +0000 UTC m=+2.700108388,LastTimestamp:2026-03-09 13:20:06.732692692 +0000 UTC 
m=+2.700108388,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.565780 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2edd663876fd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:06.738048765 +0000 UTC m=+2.705464491,LastTimestamp:2026-03-09 13:20:06.738048765 +0000 UTC m=+2.705464491,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.567620 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2edd667ab332 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:06.742389554 +0000 UTC m=+2.709805280,LastTimestamp:2026-03-09 13:20:06.742389554 +0000 UTC m=+2.709805280,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.569056 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2edd75491c2b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:06.990797867 +0000 UTC m=+2.958213553,LastTimestamp:2026-03-09 13:20:06.990797867 +0000 UTC m=+2.958213553,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.571585 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2edd754a20b9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:06.990864569 +0000 UTC m=+2.958280265,LastTimestamp:2026-03-09 13:20:06.990864569 +0000 UTC m=+2.958280265,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.575230 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2edd754bb4a4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:06.990967972 +0000 UTC m=+2.958383658,LastTimestamp:2026-03-09 13:20:06.990967972 +0000 UTC m=+2.958383658,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.579522 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2edd7552ff25 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:06.991445797 +0000 UTC m=+2.958861483,LastTimestamp:2026-03-09 13:20:06.991445797 +0000 UTC m=+2.958861483,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.583363 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2edd75d8b635 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.000208949 +0000 UTC m=+2.967624645,LastTimestamp:2026-03-09 13:20:07.000208949 +0000 UTC m=+2.967624645,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.587258 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2edd75eab7d9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.001389017 +0000 UTC m=+2.968804713,LastTimestamp:2026-03-09 13:20:07.001389017 +0000 UTC m=+2.968804713,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.591117 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2edd761c60ea openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.004643562 +0000 UTC m=+2.972059248,LastTimestamp:2026-03-09 13:20:07.004643562 +0000 UTC m=+2.972059248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.595824 4703 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2edd763eb750 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.006893904 +0000 UTC m=+2.974309590,LastTimestamp:2026-03-09 13:20:07.006893904 +0000 UTC m=+2.974309590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.600756 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2edd76622faf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.009218479 +0000 UTC m=+2.976634165,LastTimestamp:2026-03-09 13:20:07.009218479 +0000 UTC m=+2.976634165,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.604691 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2edd76eae16d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.018176877 +0000 UTC m=+2.985592563,LastTimestamp:2026-03-09 13:20:07.018176877 +0000 UTC m=+2.985592563,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.609399 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2edd81b2821f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.199031839 +0000 UTC 
m=+3.166447535,LastTimestamp:2026-03-09 13:20:07.199031839 +0000 UTC m=+3.166447535,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.615734 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2edd81b93f21 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.199473441 +0000 UTC m=+3.166889137,LastTimestamp:2026-03-09 13:20:07.199473441 +0000 UTC m=+3.166889137,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.621458 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2edd826d1380 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container 
kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.211258752 +0000 UTC m=+3.178674438,LastTimestamp:2026-03-09 13:20:07.211258752 +0000 UTC m=+3.178674438,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.628608 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2edd826f3024 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.211397156 +0000 UTC m=+3.178812842,LastTimestamp:2026-03-09 13:20:07.211397156 +0000 UTC m=+3.178812842,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.633011 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2edd8292a051 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.213719633 +0000 UTC m=+3.181135319,LastTimestamp:2026-03-09 13:20:07.213719633 +0000 UTC m=+3.181135319,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.637859 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2edd8292a141 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.213719873 +0000 UTC m=+3.181135559,LastTimestamp:2026-03-09 13:20:07.213719873 +0000 UTC m=+3.181135559,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: I0309 
13:20:30.638225 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.644498 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2edd8c8c4391 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.381074833 +0000 UTC m=+3.348490519,LastTimestamp:2026-03-09 13:20:07.381074833 +0000 UTC m=+3.348490519,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.649122 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2edd8cc4bafd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.384775421 +0000 UTC m=+3.352191107,LastTimestamp:2026-03-09 13:20:07.384775421 +0000 UTC m=+3.352191107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.653730 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2edd8d74f7c3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.396325315 +0000 UTC m=+3.363741021,LastTimestamp:2026-03-09 13:20:07.396325315 +0000 UTC m=+3.363741021,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.659751 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2edd8d925174 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.39824882 +0000 UTC m=+3.365664506,LastTimestamp:2026-03-09 13:20:07.39824882 +0000 UTC m=+3.365664506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.665445 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2edd8da07675 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.399175797 +0000 UTC m=+3.366591483,LastTimestamp:2026-03-09 13:20:07.399175797 +0000 UTC m=+3.366591483,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.669992 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2edd9644a16e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.54415243 +0000 UTC m=+3.511568116,LastTimestamp:2026-03-09 13:20:07.54415243 +0000 UTC m=+3.511568116,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.672413 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2edd9701a230 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.556538928 +0000 UTC m=+3.523954614,LastTimestamp:2026-03-09 
13:20:07.556538928 +0000 UTC m=+3.523954614,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.673438 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2edd971653c3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.557895107 +0000 UTC m=+3.525310793,LastTimestamp:2026-03-09 13:20:07.557895107 +0000 UTC m=+3.525310793,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.677323 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2edda2296d2b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.743696171 +0000 UTC m=+3.711111857,LastTimestamp:2026-03-09 13:20:07.743696171 +0000 UTC m=+3.711111857,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.681413 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2edda2a8b1fe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.752036862 +0000 UTC m=+3.719452548,LastTimestamp:2026-03-09 13:20:07.752036862 +0000 UTC m=+3.719452548,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.685782 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2edda3aedeb3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.769218739 +0000 UTC m=+3.736634425,LastTimestamp:2026-03-09 13:20:07.769218739 +0000 UTC m=+3.736634425,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.689492 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2eddadc60305 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.938507525 +0000 UTC m=+3.905923221,LastTimestamp:2026-03-09 13:20:07.938507525 +0000 UTC m=+3.905923221,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.693437 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2eddaedac4a6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.95664503 +0000 UTC m=+3.924060716,LastTimestamp:2026-03-09 13:20:07.95664503 +0000 UTC m=+3.924060716,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.699303 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2edddea9f61d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:08.758752797 +0000 UTC m=+4.726168513,LastTimestamp:2026-03-09 13:20:08.758752797 +0000 UTC m=+4.726168513,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.703978 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2eddeb4aaf18 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:08.970612504 +0000 UTC m=+4.938028190,LastTimestamp:2026-03-09 13:20:08.970612504 +0000 UTC m=+4.938028190,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.707667 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2eddebd446bb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:08.979629755 +0000 UTC m=+4.947045441,LastTimestamp:2026-03-09 13:20:08.979629755 +0000 UTC m=+4.947045441,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.713544 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2eddebe0fd8d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:08.980462989 +0000 UTC m=+4.947878675,LastTimestamp:2026-03-09 13:20:08.980462989 +0000 UTC m=+4.947878675,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.717651 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2eddf694399f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:09.159981471 +0000 UTC m=+5.127397177,LastTimestamp:2026-03-09 13:20:09.159981471 +0000 UTC m=+5.127397177,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.723533 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2eddf7827e44 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:09.175596612 +0000 UTC m=+5.143012288,LastTimestamp:2026-03-09 13:20:09.175596612 +0000 UTC m=+5.143012288,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.728221 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2eddf79662c2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:09.17690029 +0000 UTC m=+5.144315976,LastTimestamp:2026-03-09 13:20:09.17690029 +0000 UTC m=+5.144315976,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.732673 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b2ede02b409e1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:09.363392993 +0000 UTC m=+5.330808699,LastTimestamp:2026-03-09 13:20:09.363392993 +0000 UTC m=+5.330808699,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.737006 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ede034952d8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:09.373176536 +0000 UTC m=+5.340592232,LastTimestamp:2026-03-09 13:20:09.373176536 +0000 UTC m=+5.340592232,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.743301 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ede0359ae95 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:09.374248597 +0000 UTC m=+5.341664303,LastTimestamp:2026-03-09 13:20:09.374248597 +0000 UTC m=+5.341664303,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.747991 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ede0d3e7d10 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:09.540238608 +0000 UTC m=+5.507654294,LastTimestamp:2026-03-09 13:20:09.540238608 +0000 UTC m=+5.507654294,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.752765 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b2ede0e31e497 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:09.556190359 +0000 UTC m=+5.523606065,LastTimestamp:2026-03-09 13:20:09.556190359 +0000 UTC m=+5.523606065,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.757067 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ede0e520566 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:09.55829591 +0000 UTC m=+5.525711616,LastTimestamp:2026-03-09 13:20:09.55829591 +0000 UTC m=+5.525711616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.762119 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ede1a0ec4e3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:09.755215075 +0000 UTC m=+5.722630801,LastTimestamp:2026-03-09 13:20:09.755215075 +0000 UTC m=+5.722630801,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.768289 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2ede1b2928a1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:09.773721761 +0000 UTC m=+5.741137457,LastTimestamp:2026-03-09 13:20:09.773721761 +0000 UTC m=+5.741137457,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.775886 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 13:20:30 crc 
kubenswrapper[4703]: &Event{ObjectMeta:{kube-controller-manager-crc.189b2edfd1ffc2ea openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 09 13:20:30 crc kubenswrapper[4703]: body: Mar 09 13:20:30 crc kubenswrapper[4703]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:17.13620657 +0000 UTC m=+13.103622246,LastTimestamp:2026-03-09 13:20:17.13620657 +0000 UTC m=+13.103622246,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:20:30 crc kubenswrapper[4703]: > Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.780268 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2edfd2008736 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:17.136256822 +0000 UTC 
m=+13.103672498,LastTimestamp:2026-03-09 13:20:17.136256822 +0000 UTC m=+13.103672498,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.782606 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 13:20:30 crc kubenswrapper[4703]: &Event{ObjectMeta:{kube-apiserver-crc.189b2ee05ada67c1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:54872->192.168.126.11:17697: read: connection reset by peer Mar 09 13:20:30 crc kubenswrapper[4703]: body: Mar 09 13:20:30 crc kubenswrapper[4703]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:19.432236993 +0000 UTC m=+15.399652679,LastTimestamp:2026-03-09 13:20:19.432236993 +0000 UTC m=+15.399652679,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:20:30 crc kubenswrapper[4703]: > Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.786687 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee05adb11eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54872->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:19.432280555 +0000 UTC m=+15.399696241,LastTimestamp:2026-03-09 13:20:19.432280555 +0000 UTC m=+15.399696241,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.790003 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 13:20:30 crc kubenswrapper[4703]: &Event{ObjectMeta:{kube-apiserver-crc.189b2ee0717bae0a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:6443/livez": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 09 13:20:30 crc kubenswrapper[4703]: body: Mar 09 13:20:30 crc kubenswrapper[4703]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:19.811905034 +0000 UTC m=+15.779320720,LastTimestamp:2026-03-09 13:20:19.811905034 +0000 UTC m=+15.779320720,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} 
Mar 09 13:20:30 crc kubenswrapper[4703]: > Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.791833 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee0717c5f1d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:19.811950365 +0000 UTC m=+15.779366041,LastTimestamp:2026-03-09 13:20:19.811950365 +0000 UTC m=+15.779366041,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.796766 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 13:20:30 crc kubenswrapper[4703]: &Event{ObjectMeta:{kube-apiserver-crc.189b2ee07328cd49 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection 
refused Mar 09 13:20:30 crc kubenswrapper[4703]: body: Mar 09 13:20:30 crc kubenswrapper[4703]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:19.840027977 +0000 UTC m=+15.807443663,LastTimestamp:2026-03-09 13:20:19.840027977 +0000 UTC m=+15.807443663,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:20:30 crc kubenswrapper[4703]: > Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.798967 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2ee073295db4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:19.840064948 +0000 UTC m=+15.807480624,LastTimestamp:2026-03-09 13:20:19.840064948 +0000 UTC m=+15.807480624,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.802222 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b2edd971653c3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2edd971653c3 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:07.557895107 +0000 UTC m=+3.525310793,LastTimestamp:2026-03-09 13:20:20.235406442 +0000 UTC m=+16.202822158,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.806300 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 13:20:30 crc kubenswrapper[4703]: &Event{ObjectMeta:{kube-controller-manager-crc.189b2ee2260d12c1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 13:20:30 crc kubenswrapper[4703]: body: Mar 09 13:20:30 crc kubenswrapper[4703]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:27.136299713 +0000 UTC m=+23.103715449,LastTimestamp:2026-03-09 
13:20:27.136299713 +0000 UTC m=+23.103715449,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:20:30 crc kubenswrapper[4703]: > Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.812636 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee2260ebfe8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:27.136409576 +0000 UTC m=+23.103825302,LastTimestamp:2026-03-09 13:20:27.136409576 +0000 UTC m=+23.103825302,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:30 crc kubenswrapper[4703]: W0309 13:20:30.966829 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 09 13:20:30 crc kubenswrapper[4703]: E0309 13:20:30.966955 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: 
runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 09 13:20:31 crc kubenswrapper[4703]: I0309 13:20:31.638723 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:32 crc kubenswrapper[4703]: I0309 13:20:32.641610 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:32 crc kubenswrapper[4703]: W0309 13:20:32.736222 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:32 crc kubenswrapper[4703]: E0309 13:20:32.736324 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 09 13:20:33 crc kubenswrapper[4703]: W0309 13:20:33.030144 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 09 13:20:33 crc kubenswrapper[4703]: E0309 13:20:33.030221 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User 
\"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 09 13:20:33 crc kubenswrapper[4703]: I0309 13:20:33.639766 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:33 crc kubenswrapper[4703]: I0309 13:20:33.666944 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:33 crc kubenswrapper[4703]: I0309 13:20:33.668676 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:33 crc kubenswrapper[4703]: I0309 13:20:33.668727 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:33 crc kubenswrapper[4703]: I0309 13:20:33.668739 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:33 crc kubenswrapper[4703]: I0309 13:20:33.668767 4703 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:20:33 crc kubenswrapper[4703]: E0309 13:20:33.674328 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 13:20:33 crc kubenswrapper[4703]: E0309 13:20:33.674557 4703 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 13:20:34 crc kubenswrapper[4703]: I0309 13:20:34.637491 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:34 crc kubenswrapper[4703]: E0309 13:20:34.789568 4703 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:20:35 crc kubenswrapper[4703]: I0309 13:20:35.639490 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:36 crc kubenswrapper[4703]: I0309 13:20:36.640027 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:37 crc kubenswrapper[4703]: I0309 13:20:37.136294 4703 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 13:20:37 crc kubenswrapper[4703]: I0309 13:20:37.136364 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 13:20:37 crc kubenswrapper[4703]: I0309 13:20:37.136472 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:37 crc kubenswrapper[4703]: I0309 13:20:37.136653 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:37 crc kubenswrapper[4703]: I0309 13:20:37.138372 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:37 crc kubenswrapper[4703]: I0309 13:20:37.138474 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:37 crc kubenswrapper[4703]: I0309 13:20:37.138501 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:37 crc kubenswrapper[4703]: I0309 13:20:37.139295 4703 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"3a105bcef254c34ab707fc8dbc94df273ebda59e4d376c4598b2c7ba47458e6c"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 09 13:20:37 crc kubenswrapper[4703]: I0309 13:20:37.139615 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://3a105bcef254c34ab707fc8dbc94df273ebda59e4d376c4598b2c7ba47458e6c" gracePeriod=30 Mar 09 13:20:37 crc kubenswrapper[4703]: E0309 13:20:37.144179 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2ee2260d12c1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 13:20:37 crc kubenswrapper[4703]: 
&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee2260d12c1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 13:20:37 crc kubenswrapper[4703]: body: Mar 09 13:20:37 crc kubenswrapper[4703]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:27.136299713 +0000 UTC m=+23.103715449,LastTimestamp:2026-03-09 13:20:37.13634573 +0000 UTC m=+33.103761456,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:20:37 crc kubenswrapper[4703]: > Mar 09 13:20:37 crc kubenswrapper[4703]: E0309 13:20:37.151694 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2ee2260ebfe8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee2260ebfe8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:27.136409576 +0000 UTC m=+23.103825302,LastTimestamp:2026-03-09 13:20:37.136396641 +0000 UTC m=+33.103812367,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:37 crc kubenswrapper[4703]: E0309 13:20:37.161877 4703 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee47a4b1dda openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:37.139586522 +0000 UTC m=+33.107002248,LastTimestamp:2026-03-09 13:20:37.139586522 +0000 UTC m=+33.107002248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:37 crc kubenswrapper[4703]: E0309 13:20:37.267350 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2edd2dc93e6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2edd2dc93e6d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:05.791235693 +0000 UTC m=+1.758651379,LastTimestamp:2026-03-09 13:20:37.262888195 +0000 UTC m=+33.230303911,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:37 crc kubenswrapper[4703]: I0309 13:20:37.298887 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 13:20:37 crc kubenswrapper[4703]: I0309 13:20:37.300119 4703 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="3a105bcef254c34ab707fc8dbc94df273ebda59e4d376c4598b2c7ba47458e6c" exitCode=255 Mar 09 13:20:37 crc kubenswrapper[4703]: I0309 13:20:37.300213 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"3a105bcef254c34ab707fc8dbc94df273ebda59e4d376c4598b2c7ba47458e6c"} Mar 09 13:20:37 crc kubenswrapper[4703]: E0309 13:20:37.501531 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2edd3f7d613f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2edd3f7d613f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:06.088253759 +0000 UTC m=+2.055669435,LastTimestamp:2026-03-09 13:20:37.491040392 +0000 UTC m=+33.458456108,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:37 crc kubenswrapper[4703]: E0309 13:20:37.509172 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2edd403ca67e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2edd403ca67e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:06.100788862 +0000 UTC m=+2.068204588,LastTimestamp:2026-03-09 13:20:37.504155757 +0000 UTC m=+33.471571473,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:37 crc kubenswrapper[4703]: I0309 13:20:37.638108 4703 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:38 crc kubenswrapper[4703]: I0309 13:20:38.305907 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 13:20:38 crc kubenswrapper[4703]: I0309 13:20:38.306426 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7c5b7237bc4b4d4036a6f66dc705d398087158e42536fa4c5f5d1aa7c9c02d95"} Mar 09 13:20:38 crc kubenswrapper[4703]: I0309 13:20:38.306590 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:38 crc kubenswrapper[4703]: I0309 13:20:38.307956 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:38 crc kubenswrapper[4703]: I0309 13:20:38.307995 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:38 crc kubenswrapper[4703]: I0309 13:20:38.308008 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:38 crc kubenswrapper[4703]: I0309 13:20:38.639646 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:39 crc kubenswrapper[4703]: I0309 13:20:39.308978 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:39 crc 
kubenswrapper[4703]: I0309 13:20:39.311237 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:39 crc kubenswrapper[4703]: I0309 13:20:39.311325 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:39 crc kubenswrapper[4703]: I0309 13:20:39.311353 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:39 crc kubenswrapper[4703]: I0309 13:20:39.640665 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:40 crc kubenswrapper[4703]: I0309 13:20:40.640495 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:40 crc kubenswrapper[4703]: I0309 13:20:40.675879 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:40 crc kubenswrapper[4703]: E0309 13:20:40.676998 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 13:20:40 crc kubenswrapper[4703]: I0309 13:20:40.677612 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:40 crc kubenswrapper[4703]: I0309 13:20:40.677702 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:40 crc kubenswrapper[4703]: I0309 
13:20:40.677726 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:40 crc kubenswrapper[4703]: I0309 13:20:40.677768 4703 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:20:40 crc kubenswrapper[4703]: E0309 13:20:40.684940 4703 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 13:20:41 crc kubenswrapper[4703]: I0309 13:20:41.641105 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:41 crc kubenswrapper[4703]: I0309 13:20:41.970627 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:41 crc kubenswrapper[4703]: I0309 13:20:41.970925 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:41 crc kubenswrapper[4703]: I0309 13:20:41.972528 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:41 crc kubenswrapper[4703]: I0309 13:20:41.972602 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:41 crc kubenswrapper[4703]: I0309 13:20:41.972631 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:42 crc kubenswrapper[4703]: I0309 13:20:42.640962 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" 
at the cluster scope Mar 09 13:20:43 crc kubenswrapper[4703]: I0309 13:20:43.641329 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:44 crc kubenswrapper[4703]: I0309 13:20:44.136187 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:44 crc kubenswrapper[4703]: I0309 13:20:44.136506 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:44 crc kubenswrapper[4703]: I0309 13:20:44.138368 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:44 crc kubenswrapper[4703]: I0309 13:20:44.138450 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:44 crc kubenswrapper[4703]: I0309 13:20:44.138470 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:44 crc kubenswrapper[4703]: I0309 13:20:44.639287 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:44 crc kubenswrapper[4703]: I0309 13:20:44.706767 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:44 crc kubenswrapper[4703]: I0309 13:20:44.709733 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:44 crc kubenswrapper[4703]: I0309 13:20:44.709808 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:20:44 crc kubenswrapper[4703]: I0309 13:20:44.709830 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:44 crc kubenswrapper[4703]: I0309 13:20:44.710812 4703 scope.go:117] "RemoveContainer" containerID="62f0c4102f9c087bd76a8656e204eaac378a2881f4ac903da0dd6e748a48979d" Mar 09 13:20:44 crc kubenswrapper[4703]: E0309 13:20:44.790528 4703 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:20:45 crc kubenswrapper[4703]: I0309 13:20:45.329021 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 13:20:45 crc kubenswrapper[4703]: I0309 13:20:45.330990 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ddbad63b119c26bd7a6079d9c2be19ae4089b1eabf6a3d27ce1ae328a7c61ce0"} Mar 09 13:20:45 crc kubenswrapper[4703]: I0309 13:20:45.331258 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:45 crc kubenswrapper[4703]: I0309 13:20:45.332743 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:45 crc kubenswrapper[4703]: I0309 13:20:45.332785 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:45 crc kubenswrapper[4703]: I0309 13:20:45.332806 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:45 crc kubenswrapper[4703]: I0309 13:20:45.641421 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is 
forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:46 crc kubenswrapper[4703]: I0309 13:20:46.337981 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 13:20:46 crc kubenswrapper[4703]: I0309 13:20:46.339158 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 13:20:46 crc kubenswrapper[4703]: I0309 13:20:46.343554 4703 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ddbad63b119c26bd7a6079d9c2be19ae4089b1eabf6a3d27ce1ae328a7c61ce0" exitCode=255 Mar 09 13:20:46 crc kubenswrapper[4703]: I0309 13:20:46.343598 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ddbad63b119c26bd7a6079d9c2be19ae4089b1eabf6a3d27ce1ae328a7c61ce0"} Mar 09 13:20:46 crc kubenswrapper[4703]: I0309 13:20:46.343634 4703 scope.go:117] "RemoveContainer" containerID="62f0c4102f9c087bd76a8656e204eaac378a2881f4ac903da0dd6e748a48979d" Mar 09 13:20:46 crc kubenswrapper[4703]: I0309 13:20:46.343796 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:46 crc kubenswrapper[4703]: I0309 13:20:46.344630 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:46 crc kubenswrapper[4703]: I0309 13:20:46.344668 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:46 crc kubenswrapper[4703]: I0309 13:20:46.344679 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 09 13:20:46 crc kubenswrapper[4703]: I0309 13:20:46.345387 4703 scope.go:117] "RemoveContainer" containerID="ddbad63b119c26bd7a6079d9c2be19ae4089b1eabf6a3d27ce1ae328a7c61ce0" Mar 09 13:20:46 crc kubenswrapper[4703]: E0309 13:20:46.345728 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:20:46 crc kubenswrapper[4703]: I0309 13:20:46.636732 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:47 crc kubenswrapper[4703]: I0309 13:20:47.136537 4703 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 13:20:47 crc kubenswrapper[4703]: I0309 13:20:47.136621 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 13:20:47 crc kubenswrapper[4703]: E0309 13:20:47.141557 4703 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-controller-manager-crc.189b2ee2260d12c1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 13:20:47 crc kubenswrapper[4703]: &Event{ObjectMeta:{kube-controller-manager-crc.189b2ee2260d12c1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 13:20:47 crc kubenswrapper[4703]: body: Mar 09 13:20:47 crc kubenswrapper[4703]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:27.136299713 +0000 UTC m=+23.103715449,LastTimestamp:2026-03-09 13:20:47.136599567 +0000 UTC m=+43.104015283,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 13:20:47 crc kubenswrapper[4703]: > Mar 09 13:20:47 crc kubenswrapper[4703]: E0309 13:20:47.146296 4703 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2ee2260ebfe8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2ee2260ebfe8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:20:27.136409576 +0000 UTC m=+23.103825302,LastTimestamp:2026-03-09 13:20:47.136654858 +0000 UTC m=+43.104070574,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:20:47 crc kubenswrapper[4703]: W0309 13:20:47.321383 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:47 crc kubenswrapper[4703]: E0309 13:20:47.321455 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 09 13:20:47 crc kubenswrapper[4703]: I0309 13:20:47.349273 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 13:20:47 crc kubenswrapper[4703]: I0309 13:20:47.642111 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 09 13:20:47 crc kubenswrapper[4703]: E0309 13:20:47.683324 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 13:20:47 crc kubenswrapper[4703]: I0309 13:20:47.685426 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:47 crc kubenswrapper[4703]: I0309 13:20:47.687250 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:47 crc kubenswrapper[4703]: I0309 13:20:47.687299 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:47 crc kubenswrapper[4703]: I0309 13:20:47.687317 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:47 crc kubenswrapper[4703]: I0309 13:20:47.687349 4703 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:20:47 crc kubenswrapper[4703]: E0309 13:20:47.694198 4703 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 13:20:48 crc kubenswrapper[4703]: I0309 13:20:48.560021 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:48 crc kubenswrapper[4703]: I0309 13:20:48.560247 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:48 crc kubenswrapper[4703]: I0309 13:20:48.561720 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:48 crc 
kubenswrapper[4703]: I0309 13:20:48.561775 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:48 crc kubenswrapper[4703]: I0309 13:20:48.561786 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:48 crc kubenswrapper[4703]: I0309 13:20:48.562346 4703 scope.go:117] "RemoveContainer" containerID="ddbad63b119c26bd7a6079d9c2be19ae4089b1eabf6a3d27ce1ae328a7c61ce0" Mar 09 13:20:48 crc kubenswrapper[4703]: E0309 13:20:48.562532 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:20:48 crc kubenswrapper[4703]: I0309 13:20:48.641725 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:49 crc kubenswrapper[4703]: W0309 13:20:49.208334 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 09 13:20:49 crc kubenswrapper[4703]: E0309 13:20:49.208461 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 09 
13:20:49 crc kubenswrapper[4703]: I0309 13:20:49.640354 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:49 crc kubenswrapper[4703]: I0309 13:20:49.839696 4703 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:20:49 crc kubenswrapper[4703]: I0309 13:20:49.840194 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:49 crc kubenswrapper[4703]: I0309 13:20:49.841693 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:49 crc kubenswrapper[4703]: I0309 13:20:49.841835 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:49 crc kubenswrapper[4703]: I0309 13:20:49.841976 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:49 crc kubenswrapper[4703]: I0309 13:20:49.842695 4703 scope.go:117] "RemoveContainer" containerID="ddbad63b119c26bd7a6079d9c2be19ae4089b1eabf6a3d27ce1ae328a7c61ce0" Mar 09 13:20:49 crc kubenswrapper[4703]: E0309 13:20:49.843060 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:20:50 crc kubenswrapper[4703]: I0309 13:20:50.640487 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is 
forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:51 crc kubenswrapper[4703]: I0309 13:20:51.641895 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:52 crc kubenswrapper[4703]: I0309 13:20:52.636901 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:52 crc kubenswrapper[4703]: W0309 13:20:52.785640 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 09 13:20:52 crc kubenswrapper[4703]: E0309 13:20:52.785722 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 09 13:20:53 crc kubenswrapper[4703]: I0309 13:20:53.640352 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:54 crc kubenswrapper[4703]: I0309 13:20:54.142222 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:54 crc kubenswrapper[4703]: I0309 13:20:54.142517 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 09 13:20:54 crc kubenswrapper[4703]: I0309 13:20:54.144350 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:54 crc kubenswrapper[4703]: I0309 13:20:54.144408 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:54 crc kubenswrapper[4703]: I0309 13:20:54.144429 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:54 crc kubenswrapper[4703]: I0309 13:20:54.148567 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:20:54 crc kubenswrapper[4703]: I0309 13:20:54.371379 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:54 crc kubenswrapper[4703]: I0309 13:20:54.372600 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:54 crc kubenswrapper[4703]: I0309 13:20:54.372687 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:54 crc kubenswrapper[4703]: I0309 13:20:54.372712 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:54 crc kubenswrapper[4703]: I0309 13:20:54.637929 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:54 crc kubenswrapper[4703]: E0309 13:20:54.689974 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group 
\"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 13:20:54 crc kubenswrapper[4703]: I0309 13:20:54.695043 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:54 crc kubenswrapper[4703]: I0309 13:20:54.696071 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:54 crc kubenswrapper[4703]: I0309 13:20:54.696108 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:54 crc kubenswrapper[4703]: I0309 13:20:54.696121 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:54 crc kubenswrapper[4703]: I0309 13:20:54.696143 4703 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:20:54 crc kubenswrapper[4703]: E0309 13:20:54.702036 4703 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 13:20:54 crc kubenswrapper[4703]: E0309 13:20:54.791666 4703 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:20:55 crc kubenswrapper[4703]: I0309 13:20:55.638746 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:56 crc kubenswrapper[4703]: I0309 13:20:56.638616 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:57 crc 
kubenswrapper[4703]: I0309 13:20:57.638503 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:58 crc kubenswrapper[4703]: W0309 13:20:58.230700 4703 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 09 13:20:58 crc kubenswrapper[4703]: E0309 13:20:58.231152 4703 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 09 13:20:58 crc kubenswrapper[4703]: I0309 13:20:58.641547 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:20:58 crc kubenswrapper[4703]: I0309 13:20:58.748833 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 13:20:58 crc kubenswrapper[4703]: I0309 13:20:58.748984 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:20:58 crc kubenswrapper[4703]: I0309 13:20:58.749873 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:20:58 crc kubenswrapper[4703]: I0309 13:20:58.749916 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:20:58 crc kubenswrapper[4703]: I0309 13:20:58.749930 4703 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:20:59 crc kubenswrapper[4703]: I0309 13:20:59.637731 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:00 crc kubenswrapper[4703]: I0309 13:21:00.640642 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:01 crc kubenswrapper[4703]: I0309 13:21:01.641467 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:01 crc kubenswrapper[4703]: E0309 13:21:01.697564 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 13:21:01 crc kubenswrapper[4703]: I0309 13:21:01.702437 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:01 crc kubenswrapper[4703]: I0309 13:21:01.704056 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:01 crc kubenswrapper[4703]: I0309 13:21:01.704118 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:01 crc kubenswrapper[4703]: I0309 13:21:01.704140 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 09 13:21:01 crc kubenswrapper[4703]: I0309 13:21:01.704182 4703 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:21:01 crc kubenswrapper[4703]: E0309 13:21:01.709817 4703 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 13:21:02 crc kubenswrapper[4703]: I0309 13:21:02.639468 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:03 crc kubenswrapper[4703]: I0309 13:21:03.638721 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:04 crc kubenswrapper[4703]: I0309 13:21:04.639447 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:04 crc kubenswrapper[4703]: I0309 13:21:04.706507 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:04 crc kubenswrapper[4703]: I0309 13:21:04.707962 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:04 crc kubenswrapper[4703]: I0309 13:21:04.708218 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:04 crc kubenswrapper[4703]: I0309 13:21:04.708410 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 09 13:21:04 crc kubenswrapper[4703]: I0309 13:21:04.709771 4703 scope.go:117] "RemoveContainer" containerID="ddbad63b119c26bd7a6079d9c2be19ae4089b1eabf6a3d27ce1ae328a7c61ce0" Mar 09 13:21:04 crc kubenswrapper[4703]: E0309 13:21:04.710458 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:04 crc kubenswrapper[4703]: E0309 13:21:04.792614 4703 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:21:05 crc kubenswrapper[4703]: I0309 13:21:05.639510 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:06 crc kubenswrapper[4703]: I0309 13:21:06.640722 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:07 crc kubenswrapper[4703]: I0309 13:21:07.638864 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:08 crc kubenswrapper[4703]: I0309 13:21:08.640679 4703 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" 
cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 13:21:08 crc kubenswrapper[4703]: E0309 13:21:08.699883 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 13:21:08 crc kubenswrapper[4703]: I0309 13:21:08.710163 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:08 crc kubenswrapper[4703]: I0309 13:21:08.711172 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:08 crc kubenswrapper[4703]: I0309 13:21:08.711226 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:08 crc kubenswrapper[4703]: I0309 13:21:08.711239 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:08 crc kubenswrapper[4703]: I0309 13:21:08.711269 4703 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:21:08 crc kubenswrapper[4703]: E0309 13:21:08.716599 4703 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 13:21:09 crc kubenswrapper[4703]: I0309 13:21:09.178314 4703 csr.go:261] certificate signing request csr-jgnhd is approved, waiting to be issued Mar 09 13:21:09 crc kubenswrapper[4703]: I0309 13:21:09.186297 4703 csr.go:257] certificate signing request csr-jgnhd is issued Mar 09 13:21:09 crc kubenswrapper[4703]: I0309 13:21:09.283560 4703 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 09 13:21:09 crc kubenswrapper[4703]: I0309 13:21:09.490082 4703 
transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 09 13:21:10 crc kubenswrapper[4703]: I0309 13:21:10.187440 4703 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-10 02:47:57.098476745 +0000 UTC Mar 09 13:21:10 crc kubenswrapper[4703]: I0309 13:21:10.188441 4703 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7357h26m46.910046401s for next certificate rotation Mar 09 13:21:14 crc kubenswrapper[4703]: E0309 13:21:14.792806 4703 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.716947 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.718300 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.718351 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.718368 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.718503 4703 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.739721 4703 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.740377 4703 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 09 13:21:15 crc kubenswrapper[4703]: E0309 13:21:15.740408 4703 kubelet_node_status.go:585] "Error updating node status, will 
retry" err="error getting node \"crc\": node \"crc\" not found" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.744984 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.745055 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.745069 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.745111 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.745124 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:15Z","lastTransitionTime":"2026-03-09T13:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:15 crc kubenswrapper[4703]: E0309 13:21:15.759364 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.766360 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.766394 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.766407 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.766426 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.766440 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:15Z","lastTransitionTime":"2026-03-09T13:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:15 crc kubenswrapper[4703]: E0309 13:21:15.777464 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[…],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.784120 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.784141 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.784148 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.784162 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.784171 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:15Z","lastTransitionTime":"2026-03-09T13:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:15 crc kubenswrapper[4703]: E0309 13:21:15.792218 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[…],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.797988 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.798008 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.798016 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.798027 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:15 crc kubenswrapper[4703]: I0309 13:21:15.798035 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:15Z","lastTransitionTime":"2026-03-09T13:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:15 crc kubenswrapper[4703]: E0309 13:21:15.807672 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:15 crc kubenswrapper[4703]: E0309 13:21:15.807814 4703 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:21:15 crc kubenswrapper[4703]: E0309 13:21:15.807859 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:15 crc kubenswrapper[4703]: E0309 13:21:15.908704 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:16 crc kubenswrapper[4703]: E0309 13:21:16.009647 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:16 crc kubenswrapper[4703]: E0309 13:21:16.109884 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:16 crc kubenswrapper[4703]: E0309 13:21:16.210921 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:16 crc kubenswrapper[4703]: E0309 13:21:16.312060 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:16 crc kubenswrapper[4703]: E0309 13:21:16.412199 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:16 crc kubenswrapper[4703]: E0309 13:21:16.512665 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:16 crc kubenswrapper[4703]: E0309 13:21:16.612905 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:16 crc kubenswrapper[4703]: E0309 13:21:16.713616 4703 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:16 crc kubenswrapper[4703]: E0309 13:21:16.814582 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:16 crc kubenswrapper[4703]: E0309 13:21:16.915752 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:17 crc kubenswrapper[4703]: E0309 13:21:17.016600 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:17 crc kubenswrapper[4703]: E0309 13:21:17.117699 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:17 crc kubenswrapper[4703]: E0309 13:21:17.218002 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:17 crc kubenswrapper[4703]: E0309 13:21:17.318243 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:17 crc kubenswrapper[4703]: E0309 13:21:17.419297 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:17 crc kubenswrapper[4703]: E0309 13:21:17.519889 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:17 crc kubenswrapper[4703]: E0309 13:21:17.621093 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:17 crc kubenswrapper[4703]: E0309 13:21:17.722092 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:17 crc kubenswrapper[4703]: E0309 13:21:17.823350 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:17 crc 
kubenswrapper[4703]: E0309 13:21:17.924130 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:18 crc kubenswrapper[4703]: E0309 13:21:18.024585 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:18 crc kubenswrapper[4703]: E0309 13:21:18.125153 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:18 crc kubenswrapper[4703]: E0309 13:21:18.226107 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:18 crc kubenswrapper[4703]: E0309 13:21:18.327203 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:18 crc kubenswrapper[4703]: E0309 13:21:18.428255 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:18 crc kubenswrapper[4703]: E0309 13:21:18.529213 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:18 crc kubenswrapper[4703]: E0309 13:21:18.629775 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:18 crc kubenswrapper[4703]: I0309 13:21:18.706153 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:18 crc kubenswrapper[4703]: I0309 13:21:18.707736 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:18 crc kubenswrapper[4703]: I0309 13:21:18.707790 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:18 crc kubenswrapper[4703]: I0309 13:21:18.707813 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 09 13:21:18 crc kubenswrapper[4703]: I0309 13:21:18.708837 4703 scope.go:117] "RemoveContainer" containerID="ddbad63b119c26bd7a6079d9c2be19ae4089b1eabf6a3d27ce1ae328a7c61ce0" Mar 09 13:21:18 crc kubenswrapper[4703]: E0309 13:21:18.731193 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:18 crc kubenswrapper[4703]: E0309 13:21:18.831579 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:18 crc kubenswrapper[4703]: E0309 13:21:18.932024 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:19 crc kubenswrapper[4703]: E0309 13:21:19.032664 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:19 crc kubenswrapper[4703]: E0309 13:21:19.133667 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:19 crc kubenswrapper[4703]: E0309 13:21:19.233969 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:19 crc kubenswrapper[4703]: E0309 13:21:19.334713 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:19 crc kubenswrapper[4703]: E0309 13:21:19.435734 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:19 crc kubenswrapper[4703]: I0309 13:21:19.439531 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 13:21:19 crc kubenswrapper[4703]: I0309 13:21:19.441205 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565"} Mar 09 13:21:19 crc kubenswrapper[4703]: I0309 13:21:19.441361 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:19 crc kubenswrapper[4703]: I0309 13:21:19.442350 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:19 crc kubenswrapper[4703]: I0309 13:21:19.442409 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:19 crc kubenswrapper[4703]: I0309 13:21:19.442425 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:19 crc kubenswrapper[4703]: E0309 13:21:19.535991 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:19 crc kubenswrapper[4703]: E0309 13:21:19.636917 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:19 crc kubenswrapper[4703]: E0309 13:21:19.737582 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:19 crc kubenswrapper[4703]: E0309 13:21:19.838572 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:19 crc kubenswrapper[4703]: E0309 13:21:19.938936 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:20 crc kubenswrapper[4703]: E0309 13:21:20.039932 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:20 crc kubenswrapper[4703]: E0309 13:21:20.140926 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 09 13:21:20 crc kubenswrapper[4703]: I0309 13:21:20.213763 4703 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 09 13:21:20 crc kubenswrapper[4703]: E0309 13:21:20.241536 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:20 crc kubenswrapper[4703]: E0309 13:21:20.342472 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:20 crc kubenswrapper[4703]: E0309 13:21:20.442612 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:20 crc kubenswrapper[4703]: I0309 13:21:20.447140 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 13:21:20 crc kubenswrapper[4703]: I0309 13:21:20.449138 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 13:21:20 crc kubenswrapper[4703]: I0309 13:21:20.452747 4703 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565" exitCode=255 Mar 09 13:21:20 crc kubenswrapper[4703]: I0309 13:21:20.452816 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565"} Mar 09 13:21:20 crc kubenswrapper[4703]: I0309 13:21:20.452906 4703 scope.go:117] "RemoveContainer" containerID="ddbad63b119c26bd7a6079d9c2be19ae4089b1eabf6a3d27ce1ae328a7c61ce0" Mar 09 13:21:20 crc kubenswrapper[4703]: I0309 13:21:20.453181 4703 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:20 crc kubenswrapper[4703]: I0309 13:21:20.454764 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:20 crc kubenswrapper[4703]: I0309 13:21:20.454821 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:20 crc kubenswrapper[4703]: I0309 13:21:20.454886 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:20 crc kubenswrapper[4703]: I0309 13:21:20.455818 4703 scope.go:117] "RemoveContainer" containerID="88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565" Mar 09 13:21:20 crc kubenswrapper[4703]: E0309 13:21:20.456149 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:20 crc kubenswrapper[4703]: E0309 13:21:20.543459 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:20 crc kubenswrapper[4703]: E0309 13:21:20.644060 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:20 crc kubenswrapper[4703]: E0309 13:21:20.744221 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:20 crc kubenswrapper[4703]: E0309 13:21:20.844419 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:20 crc kubenswrapper[4703]: E0309 
13:21:20.945550 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:21 crc kubenswrapper[4703]: E0309 13:21:21.046780 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:21 crc kubenswrapper[4703]: E0309 13:21:21.147925 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:21 crc kubenswrapper[4703]: E0309 13:21:21.248878 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:21 crc kubenswrapper[4703]: E0309 13:21:21.349670 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:21 crc kubenswrapper[4703]: E0309 13:21:21.450260 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:21 crc kubenswrapper[4703]: I0309 13:21:21.456755 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 13:21:21 crc kubenswrapper[4703]: E0309 13:21:21.550805 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:21 crc kubenswrapper[4703]: E0309 13:21:21.651489 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:21 crc kubenswrapper[4703]: E0309 13:21:21.752039 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:21 crc kubenswrapper[4703]: E0309 13:21:21.853198 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:21 crc kubenswrapper[4703]: E0309 13:21:21.953513 4703 kubelet_node_status.go:503] "Error 
getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:22 crc kubenswrapper[4703]: E0309 13:21:22.054070 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:22 crc kubenswrapper[4703]: E0309 13:21:22.154586 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:22 crc kubenswrapper[4703]: E0309 13:21:22.255004 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:22 crc kubenswrapper[4703]: E0309 13:21:22.355154 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:22 crc kubenswrapper[4703]: E0309 13:21:22.455338 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:22 crc kubenswrapper[4703]: E0309 13:21:22.555654 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:22 crc kubenswrapper[4703]: E0309 13:21:22.656825 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:22 crc kubenswrapper[4703]: E0309 13:21:22.757480 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:22 crc kubenswrapper[4703]: E0309 13:21:22.857966 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:22 crc kubenswrapper[4703]: E0309 13:21:22.959145 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:23 crc kubenswrapper[4703]: E0309 13:21:23.059782 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:23 crc kubenswrapper[4703]: E0309 13:21:23.160035 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:23 crc kubenswrapper[4703]: E0309 13:21:23.261001 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:23 crc kubenswrapper[4703]: E0309 13:21:23.362086 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:23 crc kubenswrapper[4703]: E0309 13:21:23.462234 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:23 crc kubenswrapper[4703]: E0309 13:21:23.562981 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:23 crc kubenswrapper[4703]: E0309 13:21:23.663292 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:23 crc kubenswrapper[4703]: E0309 13:21:23.763817 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:23 crc kubenswrapper[4703]: E0309 13:21:23.864026 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:23 crc kubenswrapper[4703]: E0309 13:21:23.965052 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:24 crc kubenswrapper[4703]: E0309 13:21:24.065971 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:24 crc kubenswrapper[4703]: E0309 13:21:24.166463 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:24 crc kubenswrapper[4703]: E0309 13:21:24.266777 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:24 crc kubenswrapper[4703]: E0309 13:21:24.367196 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:24 crc kubenswrapper[4703]: E0309 13:21:24.468102 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:24 crc kubenswrapper[4703]: E0309 13:21:24.569121 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:24 crc kubenswrapper[4703]: E0309 13:21:24.669465 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:24 crc kubenswrapper[4703]: E0309 13:21:24.770085 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:24 crc kubenswrapper[4703]: E0309 13:21:24.794054 4703 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 09 13:21:24 crc kubenswrapper[4703]: E0309 13:21:24.870803 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:24 crc kubenswrapper[4703]: E0309 13:21:24.970979 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:25 crc kubenswrapper[4703]: E0309 13:21:25.071393 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:25 crc kubenswrapper[4703]: E0309 13:21:25.172072 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:25 crc kubenswrapper[4703]: E0309 13:21:25.273321 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:25 crc kubenswrapper[4703]: E0309 13:21:25.373767 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:25 crc kubenswrapper[4703]: E0309 13:21:25.474625 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:25 crc kubenswrapper[4703]: E0309 13:21:25.575261 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:25 crc kubenswrapper[4703]: E0309 13:21:25.676409 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:25 crc kubenswrapper[4703]: E0309 13:21:25.777618 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:25 crc kubenswrapper[4703]: E0309 13:21:25.878649 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 13:21:25 crc kubenswrapper[4703]: E0309 13:21:25.917136 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 09 13:21:25 crc kubenswrapper[4703]: I0309 13:21:25.922797 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:25 crc kubenswrapper[4703]: I0309 13:21:25.922872 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:25 crc kubenswrapper[4703]: I0309 13:21:25.922884 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:25 crc kubenswrapper[4703]: I0309 13:21:25.922899 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:21:25 crc kubenswrapper[4703]: I0309 13:21:25.922912 4703 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:25Z","lastTransitionTime":"2026-03-09T13:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:25 crc kubenswrapper[4703]: E0309 13:21:25.933572 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 09 13:21:25 crc kubenswrapper[4703]: I0309 13:21:25.937030 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:25 crc kubenswrapper[4703]: I0309 13:21:25.937087 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:25 crc kubenswrapper[4703]: I0309 13:21:25.937152 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:25 crc kubenswrapper[4703]: I0309 13:21:25.937177 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:21:25 crc kubenswrapper[4703]: I0309 13:21:25.937232 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:25Z","lastTransitionTime":"2026-03-09T13:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:25 crc kubenswrapper[4703]: E0309 13:21:25.949029 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 09 13:21:25 crc kubenswrapper[4703]: I0309 13:21:25.952772 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:25 crc kubenswrapper[4703]: I0309 13:21:25.952816 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:25 crc kubenswrapper[4703]: I0309 13:21:25.952831 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:25 crc kubenswrapper[4703]: I0309 13:21:25.952869 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:21:25 crc kubenswrapper[4703]: I0309 13:21:25.952885 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:25Z","lastTransitionTime":"2026-03-09T13:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:25 crc kubenswrapper[4703]: E0309 13:21:25.962131 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:25 crc kubenswrapper[4703]: I0309 13:21:25.965823 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:25 crc kubenswrapper[4703]: I0309 13:21:25.965883 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:25 crc kubenswrapper[4703]: I0309 13:21:25.965896 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:25 crc kubenswrapper[4703]: I0309 13:21:25.965912 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:25 crc kubenswrapper[4703]: I0309 13:21:25.965923 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:25Z","lastTransitionTime":"2026-03-09T13:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:25 crc kubenswrapper[4703]: E0309 13:21:25.976527 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:25 crc kubenswrapper[4703]: E0309 13:21:25.976637 4703 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:21:25 crc kubenswrapper[4703]: E0309 13:21:25.979595 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:26 crc kubenswrapper[4703]: E0309 13:21:26.079964 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:26 crc kubenswrapper[4703]: E0309 13:21:26.180573 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:26 crc kubenswrapper[4703]: E0309 13:21:26.280719 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:26 crc kubenswrapper[4703]: E0309 13:21:26.381278 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:26 crc kubenswrapper[4703]: E0309 13:21:26.482270 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:26 crc kubenswrapper[4703]: E0309 13:21:26.582438 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:26 crc kubenswrapper[4703]: E0309 13:21:26.683132 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:26 crc kubenswrapper[4703]: E0309 13:21:26.783313 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:26 crc kubenswrapper[4703]: E0309 13:21:26.883981 4703 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:26 crc kubenswrapper[4703]: E0309 13:21:26.985094 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:27 crc kubenswrapper[4703]: E0309 13:21:27.085328 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:27 crc kubenswrapper[4703]: E0309 13:21:27.186261 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:27 crc kubenswrapper[4703]: E0309 13:21:27.287189 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:27 crc kubenswrapper[4703]: E0309 13:21:27.388381 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:27 crc kubenswrapper[4703]: E0309 13:21:27.489003 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:27 crc kubenswrapper[4703]: E0309 13:21:27.589176 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:27 crc kubenswrapper[4703]: E0309 13:21:27.689765 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:27 crc kubenswrapper[4703]: E0309 13:21:27.790188 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:27 crc kubenswrapper[4703]: E0309 13:21:27.890706 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:27 crc kubenswrapper[4703]: E0309 13:21:27.991738 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:28 crc 
kubenswrapper[4703]: E0309 13:21:28.092913 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:28 crc kubenswrapper[4703]: E0309 13:21:28.193297 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:28 crc kubenswrapper[4703]: E0309 13:21:28.293462 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:28 crc kubenswrapper[4703]: E0309 13:21:28.394602 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:28 crc kubenswrapper[4703]: E0309 13:21:28.495590 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:28 crc kubenswrapper[4703]: I0309 13:21:28.560080 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:21:28 crc kubenswrapper[4703]: I0309 13:21:28.560463 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:28 crc kubenswrapper[4703]: I0309 13:21:28.562349 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:28 crc kubenswrapper[4703]: I0309 13:21:28.562391 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:28 crc kubenswrapper[4703]: I0309 13:21:28.562404 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:28 crc kubenswrapper[4703]: I0309 13:21:28.563249 4703 scope.go:117] "RemoveContainer" containerID="88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565" Mar 09 13:21:28 crc kubenswrapper[4703]: E0309 13:21:28.563477 4703 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:28 crc kubenswrapper[4703]: E0309 13:21:28.596879 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:28 crc kubenswrapper[4703]: E0309 13:21:28.697485 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:28 crc kubenswrapper[4703]: E0309 13:21:28.797956 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:28 crc kubenswrapper[4703]: E0309 13:21:28.899155 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:28 crc kubenswrapper[4703]: E0309 13:21:28.999384 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:29 crc kubenswrapper[4703]: E0309 13:21:29.100075 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:29 crc kubenswrapper[4703]: E0309 13:21:29.200525 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:29 crc kubenswrapper[4703]: E0309 13:21:29.301543 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:29 crc kubenswrapper[4703]: E0309 13:21:29.402727 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:29 crc kubenswrapper[4703]: E0309 13:21:29.503779 4703 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 09 13:21:29 crc kubenswrapper[4703]: E0309 13:21:29.604466 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:29 crc kubenswrapper[4703]: E0309 13:21:29.705342 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:29 crc kubenswrapper[4703]: E0309 13:21:29.805948 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:29 crc kubenswrapper[4703]: I0309 13:21:29.839962 4703 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:21:29 crc kubenswrapper[4703]: I0309 13:21:29.840178 4703 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 13:21:29 crc kubenswrapper[4703]: I0309 13:21:29.842228 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:29 crc kubenswrapper[4703]: I0309 13:21:29.842259 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:29 crc kubenswrapper[4703]: I0309 13:21:29.842271 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:29 crc kubenswrapper[4703]: I0309 13:21:29.842890 4703 scope.go:117] "RemoveContainer" containerID="88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565" Mar 09 13:21:29 crc kubenswrapper[4703]: E0309 13:21:29.843067 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:29 crc kubenswrapper[4703]: E0309 13:21:29.906653 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:30 crc kubenswrapper[4703]: E0309 13:21:30.007819 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:30 crc kubenswrapper[4703]: E0309 13:21:30.108482 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:30 crc kubenswrapper[4703]: E0309 13:21:30.209122 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:30 crc kubenswrapper[4703]: E0309 13:21:30.310076 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:30 crc kubenswrapper[4703]: E0309 13:21:30.410983 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:30 crc kubenswrapper[4703]: E0309 13:21:30.511741 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:30 crc kubenswrapper[4703]: E0309 13:21:30.611955 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:30 crc kubenswrapper[4703]: E0309 13:21:30.712066 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:30 crc kubenswrapper[4703]: E0309 13:21:30.812837 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:30 crc kubenswrapper[4703]: E0309 13:21:30.913604 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:30 crc 
kubenswrapper[4703]: I0309 13:21:30.996674 4703 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 09 13:21:31 crc kubenswrapper[4703]: E0309 13:21:31.014430 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:31 crc kubenswrapper[4703]: E0309 13:21:31.115244 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:31 crc kubenswrapper[4703]: E0309 13:21:31.216191 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:31 crc kubenswrapper[4703]: E0309 13:21:31.317226 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:31 crc kubenswrapper[4703]: E0309 13:21:31.418324 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:31 crc kubenswrapper[4703]: E0309 13:21:31.518618 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:31 crc kubenswrapper[4703]: E0309 13:21:31.619228 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:31 crc kubenswrapper[4703]: E0309 13:21:31.720180 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:31 crc kubenswrapper[4703]: E0309 13:21:31.820326 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:31 crc kubenswrapper[4703]: E0309 13:21:31.921564 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:32 crc kubenswrapper[4703]: E0309 13:21:32.022027 4703 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 09 13:21:32 crc kubenswrapper[4703]: E0309 13:21:32.122369 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:32 crc kubenswrapper[4703]: E0309 13:21:32.223461 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:32 crc kubenswrapper[4703]: E0309 13:21:32.323645 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:32 crc kubenswrapper[4703]: E0309 13:21:32.424741 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:32 crc kubenswrapper[4703]: E0309 13:21:32.525402 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:32 crc kubenswrapper[4703]: E0309 13:21:32.625699 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:32 crc kubenswrapper[4703]: E0309 13:21:32.731125 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:32 crc kubenswrapper[4703]: E0309 13:21:32.832109 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:32 crc kubenswrapper[4703]: E0309 13:21:32.933164 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:33 crc kubenswrapper[4703]: E0309 13:21:33.034196 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:33 crc kubenswrapper[4703]: E0309 13:21:33.135223 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:33 crc kubenswrapper[4703]: E0309 13:21:33.236105 4703 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:33 crc kubenswrapper[4703]: E0309 13:21:33.336951 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:33 crc kubenswrapper[4703]: E0309 13:21:33.437882 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:33 crc kubenswrapper[4703]: E0309 13:21:33.538715 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:33 crc kubenswrapper[4703]: I0309 13:21:33.562653 4703 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 09 13:21:33 crc kubenswrapper[4703]: E0309 13:21:33.638958 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:33 crc kubenswrapper[4703]: E0309 13:21:33.739176 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:33 crc kubenswrapper[4703]: E0309 13:21:33.840111 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:33 crc kubenswrapper[4703]: E0309 13:21:33.941100 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:34 crc kubenswrapper[4703]: E0309 13:21:34.041568 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:34 crc kubenswrapper[4703]: E0309 13:21:34.142349 4703 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.197307 4703 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.244890 
4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.244941 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.244961 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.244987 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.245005 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:34Z","lastTransitionTime":"2026-03-09T13:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.348152 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.348404 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.348434 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.348459 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.348482 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:34Z","lastTransitionTime":"2026-03-09T13:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.450753 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.450808 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.450832 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.450889 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.450911 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:34Z","lastTransitionTime":"2026-03-09T13:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.471588 4703 apiserver.go:52] "Watching apiserver" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.476908 4703 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.477201 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.478076 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:34 crc kubenswrapper[4703]: E0309 13:21:34.478173 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.478298 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.478333 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.478384 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:21:34 crc kubenswrapper[4703]: E0309 13:21:34.478394 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.478297 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.478526 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:21:34 crc kubenswrapper[4703]: E0309 13:21:34.478604 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.481326 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.481561 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.483013 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.483082 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.483129 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.483344 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.483604 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.484008 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.485362 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.517262 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.529179 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.535783 4703 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.547788 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.553246 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.553290 4703 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.553301 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.553319 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.553337 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:34Z","lastTransitionTime":"2026-03-09T13:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.563803 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.563872 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.563897 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 13:21:34 crc 
kubenswrapper[4703]: I0309 13:21:34.563919 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.563966 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.563988 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564081 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564104 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564129 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564176 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564199 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564243 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564268 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564290 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 09 13:21:34 
crc kubenswrapper[4703]: I0309 13:21:34.564335 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564359 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564405 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564431 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564475 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564498 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564519 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564565 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564587 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564610 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564653 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564675 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564718 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564743 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564770 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564810 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564831 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 13:21:34 crc 
kubenswrapper[4703]: I0309 13:21:34.564882 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564904 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564947 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564976 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.564998 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565042 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565064 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565111 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565147 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565181 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565191 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565248 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565269 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565289 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565311 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565343 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565373 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565394 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565414 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565435 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565457 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565477 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565502 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565525 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565547 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565569 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565589 4703 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565611 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565635 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565651 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565667 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565681 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" 
(UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565696 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565712 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565728 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565743 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565757 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565776 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565794 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565813 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565828 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565859 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565875 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 
13:21:34.565875 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565891 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565909 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565930 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565952 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565970 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.565974 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566019 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566013 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566045 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566005 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566073 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566098 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566180 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 
13:21:34.566190 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566204 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566203 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566222 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566241 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566263 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566285 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566304 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566324 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566344 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566365 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566372 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566386 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566495 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566527 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566558 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566586 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566612 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566639 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566665 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566691 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566717 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566746 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 13:21:34 
crc kubenswrapper[4703]: I0309 13:21:34.566771 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566799 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566827 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566875 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566901 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566925 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566952 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566975 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567000 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567028 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567052 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567075 4703 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567097 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567118 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567139 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567286 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567316 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567341 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567366 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567388 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567411 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567433 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 13:21:34 crc 
kubenswrapper[4703]: I0309 13:21:34.567460 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567651 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567677 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567701 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567727 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567751 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567774 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567800 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567824 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567867 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567894 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567928 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567954 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567978 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568000 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568024 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568051 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568072 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568096 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568126 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568153 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568179 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: 
\"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568206 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568229 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568253 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568278 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568304 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 
13:21:34.568327 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568352 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568375 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568399 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568424 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568450 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568473 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568670 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568692 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568715 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568741 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568769 4703 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568793 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568817 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568860 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568886 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568913 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568942 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568966 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568991 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569018 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569044 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569067 4703 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569093 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569118 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569141 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569166 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569189 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569212 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569237 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569263 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569287 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569315 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569343 4703 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569368 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569396 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569427 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569452 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569476 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" 
(UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569499 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569525 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569550 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569573 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570218 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570264 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570297 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570323 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570355 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570395 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 
13:21:34.570428 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570455 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570480 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570510 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570538 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570563 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570588 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570617 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570701 4703 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570719 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc 
kubenswrapper[4703]: I0309 13:21:34.570735 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570748 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570762 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570775 4703 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570788 4703 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570801 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566395 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566492 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.566754 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567146 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567315 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567860 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.567870 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568144 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568195 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568412 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568467 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.568720 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569215 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569311 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569355 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569659 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569670 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569694 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.569835 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570269 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570295 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.570901 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.572183 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.572333 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.572956 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.572968 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.573231 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.573684 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.574195 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.574474 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.574520 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.574936 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.575221 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.575684 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.575917 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.576296 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.576881 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.576889 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.577295 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.577649 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.577695 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.578189 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.578285 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.578449 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: E0309 13:21:34.578529 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:21:35.078509188 +0000 UTC m=+91.045924874 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.579172 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.579828 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.579905 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.578831 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.579939 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.580246 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.580259 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.580276 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.580370 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.580441 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.580552 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.580593 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.581122 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.581396 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.581453 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.581688 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.581763 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.581927 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.582189 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.582223 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.582560 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.582578 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.582647 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.582776 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.583181 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.583216 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.584698 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.584153 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.584834 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.584833 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.585212 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.585306 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.585341 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.585587 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.585598 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.585748 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.585787 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.586036 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.586348 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.586370 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.586596 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.586757 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.586905 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.585396 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.587784 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.588173 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.583496 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.588359 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.588460 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.588727 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.588753 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.589064 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.589109 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.589120 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.589489 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.589667 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.590200 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.590277 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.590401 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.590785 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.578531 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.591017 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.591155 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.591357 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.591403 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.591405 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.592393 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.593265 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.593462 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.593806 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.593883 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.594100 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.594204 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.594339 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.594827 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.595766 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.595833 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.596015 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.596058 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.598223 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.598428 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.598527 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.596117 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.596298 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.596455 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.598946 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.596469 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.596605 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.596609 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.596631 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.596923 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.596929 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.596957 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.597246 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.597756 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.597768 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.597810 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.597936 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.599475 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.599552 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.597156 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.598396 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.600592 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.600768 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.601236 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.599839 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.601280 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.601326 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.601325 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.599039 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.602135 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.602172 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.602959 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.602965 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.602406 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.602483 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.602662 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.602887 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.602919 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: E0309 13:21:34.603134 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:21:34 crc kubenswrapper[4703]: E0309 13:21:34.603170 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:21:34 crc kubenswrapper[4703]: E0309 13:21:34.603190 4703 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:34 crc kubenswrapper[4703]: E0309 13:21:34.603318 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:35.103283775 +0000 UTC m=+91.070699501 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.603729 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.603784 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.603899 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.604063 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.604091 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.604187 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.604350 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.604424 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.604437 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.604473 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.604738 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.605040 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.605193 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.605421 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.600538 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: E0309 13:21:34.605547 4703 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.605072 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.606118 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: E0309 13:21:34.606257 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-09 13:21:35.10622261 +0000 UTC m=+91.073638366 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.606471 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.603145 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.607327 4703 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 09 13:21:34 crc kubenswrapper[4703]: E0309 13:21:34.607734 4703 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:21:34 crc kubenswrapper[4703]: E0309 13:21:34.607808 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:35.107793505 +0000 UTC m=+91.075209191 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.611621 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.612591 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.612601 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.613597 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.614323 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:21:34 crc kubenswrapper[4703]: E0309 13:21:34.614995 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:21:34 crc kubenswrapper[4703]: E0309 13:21:34.615014 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:21:34 crc kubenswrapper[4703]: E0309 13:21:34.615030 4703 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:34 crc kubenswrapper[4703]: E0309 13:21:34.615077 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:35.115061665 +0000 UTC m=+91.082477361 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.615962 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.617006 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.623803 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.625531 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.629651 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.630650 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.633067 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.633366 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.640189 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.641558 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.653646 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.656584 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.656672 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.656692 4703 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.656715 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.656740 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:34Z","lastTransitionTime":"2026-03-09T13:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.659900 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.667450 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671195 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671258 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671326 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 
13:21:34.671340 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671353 4703 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671364 4703 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671376 4703 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671387 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671400 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671411 4703 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671424 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671435 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671446 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671457 4703 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671468 4703 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671481 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671494 4703 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671506 4703 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" 
DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671517 4703 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671528 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671539 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671552 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671452 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671562 4703 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671626 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc 
kubenswrapper[4703]: I0309 13:21:34.671641 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671658 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671672 4703 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671685 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671697 4703 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671710 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671722 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671736 4703 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671748 4703 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671762 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671774 4703 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671786 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671799 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671811 4703 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671824 4703 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671837 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671867 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671883 4703 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671898 4703 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671910 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671924 4703 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671937 4703 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671951 4703 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671963 4703 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671974 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671986 4703 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671998 4703 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672011 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672023 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node 
\"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.671477 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672036 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672051 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672093 4703 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672118 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672135 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672152 4703 reconciler_common.go:293] "Volume detached for 
volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672166 4703 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672181 4703 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672194 4703 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672206 4703 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672218 4703 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672230 4703 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672243 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672255 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672266 4703 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672278 4703 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672291 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672304 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672316 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672329 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672340 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672353 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672366 4703 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672378 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672392 4703 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672403 4703 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672415 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672427 4703 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672440 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672453 4703 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672464 4703 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672476 4703 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672488 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672501 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672521 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672533 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672545 4703 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672557 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672571 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672582 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672594 4703 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672606 4703 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672618 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672630 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672642 4703 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672653 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672666 4703 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672678 4703 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" 
Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672690 4703 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672704 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672717 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672730 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672742 4703 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672754 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672766 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672778 4703 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672790 4703 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672802 4703 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672814 4703 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672826 4703 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672839 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672868 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672880 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672892 4703 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672905 4703 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672917 4703 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672930 4703 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672942 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672954 4703 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672966 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on 
node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672979 4703 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.672990 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673002 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673014 4703 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673025 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673037 4703 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673049 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673061 4703 reconciler_common.go:293] "Volume detached 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673073 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673086 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673098 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673111 4703 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673124 4703 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673136 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673148 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node 
\"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673161 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673173 4703 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673184 4703 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673197 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673212 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673224 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673237 4703 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc 
kubenswrapper[4703]: I0309 13:21:34.673249 4703 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673261 4703 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673273 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673285 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673298 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673311 4703 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673322 4703 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673334 4703 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673346 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673358 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673370 4703 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673381 4703 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673393 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673405 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673416 4703 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673428 4703 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673441 4703 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673453 4703 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673464 4703 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673475 4703 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673487 4703 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673498 4703 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc 
kubenswrapper[4703]: I0309 13:21:34.673510 4703 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673522 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673535 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673547 4703 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673558 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673570 4703 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673582 4703 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673596 4703 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673608 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673621 4703 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673632 4703 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673645 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673657 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673669 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673682 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673693 4703 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673706 4703 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673717 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673729 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673741 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.673753 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.678548 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.689416 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.699550 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.712960 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.713586 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.715041 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.715914 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.716797 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.717051 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.717678 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 09 13:21:34 crc 
kubenswrapper[4703]: I0309 13:21:34.718383 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.719510 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.720284 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.721384 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.722043 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.723296 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.723992 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.724613 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 09 13:21:34 crc 
kubenswrapper[4703]: I0309 13:21:34.725670 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.726371 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.727378 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.727507 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.727991 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.728683 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.729872 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.730419 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.731717 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.732259 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.733476 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.734037 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.734761 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.736209 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.736357 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.736772 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.737926 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" 
Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.738483 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.739513 4703 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.739636 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.741658 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.742751 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.743274 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.745089 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.745900 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.745941 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.746966 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.747781 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.749028 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.749570 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.750715 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.751508 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.752663 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.753238 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.754356 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.755087 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.756068 4703 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.756401 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.757001 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.758192 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.759113 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.760201 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.760906 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.761463 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.762972 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.763000 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.763011 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.763024 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.763035 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:34Z","lastTransitionTime":"2026-03-09T13:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.768271 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.803965 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.820527 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.830701 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 13:21:34 crc kubenswrapper[4703]: W0309 13:21:34.844206 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-22f08f461ef2d9c1d30109e61d591a12a5cb578b55de78e31a84a85bbb103848 WatchSource:0}: Error finding container 22f08f461ef2d9c1d30109e61d591a12a5cb578b55de78e31a84a85bbb103848: Status 404 returned error can't find the container with id 22f08f461ef2d9c1d30109e61d591a12a5cb578b55de78e31a84a85bbb103848 Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.866078 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.866415 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.866431 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.866450 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.866462 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:34Z","lastTransitionTime":"2026-03-09T13:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.968509 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.968562 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.968575 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.968591 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:34 crc kubenswrapper[4703]: I0309 13:21:34.968622 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:34Z","lastTransitionTime":"2026-03-09T13:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.070893 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.070930 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.070938 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.070952 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.070962 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:35Z","lastTransitionTime":"2026-03-09T13:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.078874 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:21:35 crc kubenswrapper[4703]: E0309 13:21:35.078974 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 13:21:36.078954643 +0000 UTC m=+92.046370339 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.173979 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.174037 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.174048 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.174065 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.174080 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:35Z","lastTransitionTime":"2026-03-09T13:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.180496 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.180613 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.180710 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:35 crc kubenswrapper[4703]: E0309 13:21:35.180727 4703 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.180746 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:21:35 crc kubenswrapper[4703]: E0309 13:21:35.180790 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:21:35 crc kubenswrapper[4703]: E0309 13:21:35.180819 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:36.180778908 +0000 UTC m=+92.148194594 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:21:35 crc kubenswrapper[4703]: E0309 13:21:35.180829 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:21:35 crc kubenswrapper[4703]: E0309 13:21:35.180867 4703 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:35 crc kubenswrapper[4703]: E0309 13:21:35.180928 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:36.180909622 +0000 UTC m=+92.148325318 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:35 crc kubenswrapper[4703]: E0309 13:21:35.180927 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:21:35 crc kubenswrapper[4703]: E0309 13:21:35.180962 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:21:35 crc kubenswrapper[4703]: E0309 13:21:35.180977 4703 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:35 crc kubenswrapper[4703]: E0309 13:21:35.181040 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:36.181019975 +0000 UTC m=+92.148435671 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:35 crc kubenswrapper[4703]: E0309 13:21:35.181127 4703 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:21:35 crc kubenswrapper[4703]: E0309 13:21:35.181165 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:36.181154879 +0000 UTC m=+92.148570575 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.277575 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.277637 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.277651 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.277681 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.277697 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:35Z","lastTransitionTime":"2026-03-09T13:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.380026 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.380062 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.380071 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.380084 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.380092 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:35Z","lastTransitionTime":"2026-03-09T13:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.482325 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.482385 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.482408 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.482436 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.482472 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:35Z","lastTransitionTime":"2026-03-09T13:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.502621 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662"} Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.502697 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"22f08f461ef2d9c1d30109e61d591a12a5cb578b55de78e31a84a85bbb103848"} Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.503827 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9e0913b392a636e7ed7f04effe7efa4080187e2cb5c066fdb30544f31996b409"} Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.505633 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33"} Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.505690 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3"} Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.505711 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"31fe016609ef6d6998f1323eccad52848d21ea690f09618e25715d29108fd3a6"} Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.521136 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.536368 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.551568 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.564701 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.576473 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.585367 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.585404 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.585415 4703 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.585431 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.585443 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:35Z","lastTransitionTime":"2026-03-09T13:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.590039 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.600811 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.614929 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.633202 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.646662 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.670669 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.687780 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.687828 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.687865 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.687885 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.687897 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:35Z","lastTransitionTime":"2026-03-09T13:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.704605 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:35Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.706788 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.706883 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.706917 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:35 crc kubenswrapper[4703]: E0309 13:21:35.706942 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:21:35 crc kubenswrapper[4703]: E0309 13:21:35.706999 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:21:35 crc kubenswrapper[4703]: E0309 13:21:35.707064 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.720904 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.790371 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.790410 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.790420 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.790434 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.790444 
4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:35Z","lastTransitionTime":"2026-03-09T13:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.891961 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.892002 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.892012 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.892026 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.892038 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:35Z","lastTransitionTime":"2026-03-09T13:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.994484 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.994521 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.994531 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.994568 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:35 crc kubenswrapper[4703]: I0309 13:21:35.994581 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:35Z","lastTransitionTime":"2026-03-09T13:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.087935 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:21:36 crc kubenswrapper[4703]: E0309 13:21:36.087943 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 13:21:38.087924156 +0000 UTC m=+94.055339842 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.096395 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.096421 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.096432 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.096446 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.096457 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:36Z","lastTransitionTime":"2026-03-09T13:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.189089 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.189154 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.189202 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.189245 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:21:36 crc kubenswrapper[4703]: E0309 13:21:36.189272 4703 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:21:36 crc kubenswrapper[4703]: E0309 13:21:36.189393 4703 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:21:36 crc kubenswrapper[4703]: E0309 13:21:36.189406 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:38.189380391 +0000 UTC m=+94.156796077 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:21:36 crc kubenswrapper[4703]: E0309 13:21:36.189403 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:21:36 crc kubenswrapper[4703]: E0309 13:21:36.189507 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:21:36 crc kubenswrapper[4703]: E0309 13:21:36.189408 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:21:36 crc kubenswrapper[4703]: E0309 13:21:36.189522 4703 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:36 crc kubenswrapper[4703]: E0309 13:21:36.189541 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:21:36 crc kubenswrapper[4703]: E0309 13:21:36.189514 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:38.189486164 +0000 UTC m=+94.156902010 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:21:36 crc kubenswrapper[4703]: E0309 13:21:36.189561 4703 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:36 crc kubenswrapper[4703]: E0309 13:21:36.189581 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:38.189561726 +0000 UTC m=+94.156977432 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:36 crc kubenswrapper[4703]: E0309 13:21:36.189625 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:38.189609298 +0000 UTC m=+94.157025144 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.199466 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.199509 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.199523 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.199540 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.199574 4703 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:36Z","lastTransitionTime":"2026-03-09T13:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.302283 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.302319 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.302329 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.302343 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.302355 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:36Z","lastTransitionTime":"2026-03-09T13:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.318371 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.318400 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.318408 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.318420 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.318428 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:36Z","lastTransitionTime":"2026-03-09T13:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:36 crc kubenswrapper[4703]: E0309 13:21:36.330653 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.334245 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.334291 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.334301 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.334319 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.334331 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:36Z","lastTransitionTime":"2026-03-09T13:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:36 crc kubenswrapper[4703]: E0309 13:21:36.348415 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.353751 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.353790 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.353807 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.353826 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.353860 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:36Z","lastTransitionTime":"2026-03-09T13:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:36 crc kubenswrapper[4703]: E0309 13:21:36.388733 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.392527 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.392561 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.392573 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.392587 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.392595 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:36Z","lastTransitionTime":"2026-03-09T13:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:36 crc kubenswrapper[4703]: E0309 13:21:36.402647 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:36Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:36 crc kubenswrapper[4703]: E0309 13:21:36.402789 4703 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.404224 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.404275 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.404291 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.404313 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.404331 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:36Z","lastTransitionTime":"2026-03-09T13:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.506332 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.506385 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.506400 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.506416 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.506428 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:36Z","lastTransitionTime":"2026-03-09T13:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.608459 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.608497 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.608505 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.608519 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.608528 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:36Z","lastTransitionTime":"2026-03-09T13:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.710555 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.710591 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.710601 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.710613 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.710623 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:36Z","lastTransitionTime":"2026-03-09T13:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.813014 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.813130 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.813148 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.813171 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.813189 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:36Z","lastTransitionTime":"2026-03-09T13:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.915585 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.915609 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.915617 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.915630 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:36 crc kubenswrapper[4703]: I0309 13:21:36.915638 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:36Z","lastTransitionTime":"2026-03-09T13:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.017623 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.017673 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.017685 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.017701 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.017713 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:37Z","lastTransitionTime":"2026-03-09T13:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.120527 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.120592 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.120610 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.121041 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.121100 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:37Z","lastTransitionTime":"2026-03-09T13:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.223468 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.223517 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.223532 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.223551 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.223565 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:37Z","lastTransitionTime":"2026-03-09T13:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.326778 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.327154 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.327338 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.327532 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.327668 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:37Z","lastTransitionTime":"2026-03-09T13:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.429370 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.429402 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.429412 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.429426 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.429436 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:37Z","lastTransitionTime":"2026-03-09T13:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.510828 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c"} Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.531829 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.531889 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.531900 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.531915 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.531926 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:37Z","lastTransitionTime":"2026-03-09T13:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.532539 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.549997 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.564598 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.578151 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:21:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.589117 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.601778 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.614856 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:37Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.633640 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.633685 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.633695 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.633712 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.633724 4703 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:37Z","lastTransitionTime":"2026-03-09T13:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.705858 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:21:37 crc kubenswrapper[4703]: E0309 13:21:37.705994 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.706024 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.706092 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:21:37 crc kubenswrapper[4703]: E0309 13:21:37.706210 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:21:37 crc kubenswrapper[4703]: E0309 13:21:37.706333 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.736166 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.736220 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.736235 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.736255 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.736270 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:37Z","lastTransitionTime":"2026-03-09T13:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.838654 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.838708 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.838719 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.838747 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.838761 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:37Z","lastTransitionTime":"2026-03-09T13:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.941556 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.941594 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.941604 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.941620 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:37 crc kubenswrapper[4703]: I0309 13:21:37.941631 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:37Z","lastTransitionTime":"2026-03-09T13:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.044029 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.044065 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.044076 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.044092 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.044104 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:38Z","lastTransitionTime":"2026-03-09T13:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.104885 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:21:38 crc kubenswrapper[4703]: E0309 13:21:38.105176 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 13:21:42.10514456 +0000 UTC m=+98.072560286 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.146199 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.146230 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.146238 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.146251 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.146261 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:38Z","lastTransitionTime":"2026-03-09T13:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.206165 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.206208 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.206234 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.206255 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:21:38 crc kubenswrapper[4703]: E0309 13:21:38.206373 4703 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:21:38 crc kubenswrapper[4703]: E0309 13:21:38.206412 4703 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:21:38 crc kubenswrapper[4703]: E0309 13:21:38.206379 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:21:38 crc kubenswrapper[4703]: E0309 13:21:38.206464 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:21:38 crc kubenswrapper[4703]: E0309 13:21:38.206475 4703 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:38 crc kubenswrapper[4703]: E0309 13:21:38.206449 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:42.2064307 +0000 UTC m=+98.173846386 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:21:38 crc kubenswrapper[4703]: E0309 13:21:38.206507 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:21:38 crc kubenswrapper[4703]: E0309 13:21:38.206554 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:21:38 crc kubenswrapper[4703]: E0309 13:21:38.206569 4703 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:38 crc kubenswrapper[4703]: E0309 13:21:38.206529 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:42.206509512 +0000 UTC m=+98.173925278 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:21:38 crc kubenswrapper[4703]: E0309 13:21:38.206786 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:42.206705928 +0000 UTC m=+98.174121664 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:38 crc kubenswrapper[4703]: E0309 13:21:38.206874 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:42.206827652 +0000 UTC m=+98.174243378 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.248728 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.248792 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.248809 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.248832 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.248872 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:38Z","lastTransitionTime":"2026-03-09T13:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.352365 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.352429 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.352451 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.352480 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.352501 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:38Z","lastTransitionTime":"2026-03-09T13:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.455526 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.455591 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.455609 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.455635 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.455688 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:38Z","lastTransitionTime":"2026-03-09T13:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.559097 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.559147 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.559166 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.559189 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.559207 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:38Z","lastTransitionTime":"2026-03-09T13:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.662288 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.662343 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.662363 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.662394 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.662414 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:38Z","lastTransitionTime":"2026-03-09T13:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.766462 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.766511 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.766533 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.766555 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.766572 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:38Z","lastTransitionTime":"2026-03-09T13:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.870920 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.870978 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.870996 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.871022 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.871042 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:38Z","lastTransitionTime":"2026-03-09T13:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.925664 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:38 crc kubenswrapper[4703]: E0309 13:21:38.925838 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.973573 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.973615 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.973626 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.973639 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:38 crc kubenswrapper[4703]: I0309 13:21:38.973650 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:38Z","lastTransitionTime":"2026-03-09T13:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.075597 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.075626 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.075659 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.075673 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.075681 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:39Z","lastTransitionTime":"2026-03-09T13:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.158044 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-q5n88"] Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.158516 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-q5n88" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.159715 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-pmzvj"] Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.160086 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.164716 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.165254 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.165335 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.165637 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.165732 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.165819 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.166593 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.166631 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.171333 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9khwq"] Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.172236 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-n9x5k"] Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.172409 4703 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-multus/multus-additional-cni-plugins-rwff8"] Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.173094 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.173285 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.174754 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.177768 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.178005 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.178147 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.178485 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.179095 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.179349 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.179948 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.180424 4703 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.180595 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.180829 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.180909 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.181692 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.183015 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.183057 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.183106 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.183132 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.183143 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:39Z","lastTransitionTime":"2026-03-09T13:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.183251 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.184643 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.185274 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.216633 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-run-netns\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.216697 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4fbab78b-1484-4244-8d11-ec4f47b43718-cnibin\") pod \"multus-additional-cni-plugins-rwff8\" (UID: \"4fbab78b-1484-4244-8d11-ec4f47b43718\") " pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.216721 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-multus-cni-dir\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.216756 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-host-var-lib-cni-bin\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.216776 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-host-var-lib-kubelet\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.216796 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-x9psl\" (UniqueName: \"kubernetes.io/projected/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-kube-api-access-x9psl\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.216819 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/4316a119-ceb8-44c1-a4ad-2d64ca0c0f29-rootfs\") pod \"machine-config-daemon-pmzvj\" (UID: \"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\") " pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.216837 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-hostroot\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.216874 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d59f2278-9dbc-48bb-8d56-fa9da4183118-multus-daemon-config\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.216893 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-ovnkube-config\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.216916 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-ovnkube-script-lib\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.216937 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-var-lib-openvswitch\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.216960 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-multus-socket-dir-parent\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.216983 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-log-socket\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217002 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4fbab78b-1484-4244-8d11-ec4f47b43718-cni-binary-copy\") pod \"multus-additional-cni-plugins-rwff8\" (UID: \"4fbab78b-1484-4244-8d11-ec4f47b43718\") " pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217034 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-cnibin\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217053 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-host-var-lib-cni-multus\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217076 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-etc-kubernetes\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217096 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-run-openvswitch\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217117 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-kubelet\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217140 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217172 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4316a119-ceb8-44c1-a4ad-2d64ca0c0f29-proxy-tls\") pod \"machine-config-daemon-pmzvj\" (UID: \"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\") " pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217194 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4fbab78b-1484-4244-8d11-ec4f47b43718-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rwff8\" (UID: \"4fbab78b-1484-4244-8d11-ec4f47b43718\") " pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217213 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-system-cni-dir\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217234 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-host-run-k8s-cni-cncf-io\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " 
pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217256 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-host-run-netns\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217276 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-ovn-node-metrics-cert\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217295 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4fbab78b-1484-4244-8d11-ec4f47b43718-system-cni-dir\") pod \"multus-additional-cni-plugins-rwff8\" (UID: \"4fbab78b-1484-4244-8d11-ec4f47b43718\") " pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217317 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2311ba56-bb75-4876-ad86-6c74012001ae-hosts-file\") pod \"node-resolver-q5n88\" (UID: \"2311ba56-bb75-4876-ad86-6c74012001ae\") " pod="openshift-dns/node-resolver-q5n88" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217336 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-host-run-multus-certs\") pod \"multus-n9x5k\" (UID: 
\"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217357 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-node-log\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217376 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkddz\" (UniqueName: \"kubernetes.io/projected/4fbab78b-1484-4244-8d11-ec4f47b43718-kube-api-access-tkddz\") pod \"multus-additional-cni-plugins-rwff8\" (UID: \"4fbab78b-1484-4244-8d11-ec4f47b43718\") " pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217397 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbxp9\" (UniqueName: \"kubernetes.io/projected/2311ba56-bb75-4876-ad86-6c74012001ae-kube-api-access-sbxp9\") pod \"node-resolver-q5n88\" (UID: \"2311ba56-bb75-4876-ad86-6c74012001ae\") " pod="openshift-dns/node-resolver-q5n88" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217416 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d59f2278-9dbc-48bb-8d56-fa9da4183118-cni-binary-copy\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217436 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-multus-conf-dir\") pod 
\"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217455 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-etc-openvswitch\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217475 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swqtr\" (UniqueName: \"kubernetes.io/projected/d59f2278-9dbc-48bb-8d56-fa9da4183118-kube-api-access-swqtr\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217497 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-systemd-units\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217519 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z2rc\" (UniqueName: \"kubernetes.io/projected/4316a119-ceb8-44c1-a4ad-2d64ca0c0f29-kube-api-access-4z2rc\") pod \"machine-config-daemon-pmzvj\" (UID: \"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\") " pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217539 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/4fbab78b-1484-4244-8d11-ec4f47b43718-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rwff8\" (UID: \"4fbab78b-1484-4244-8d11-ec4f47b43718\") " pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217560 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-os-release\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217606 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4fbab78b-1484-4244-8d11-ec4f47b43718-os-release\") pod \"multus-additional-cni-plugins-rwff8\" (UID: \"4fbab78b-1484-4244-8d11-ec4f47b43718\") " pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217628 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-env-overrides\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217648 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4316a119-ceb8-44c1-a4ad-2d64ca0c0f29-mcd-auth-proxy-config\") pod \"machine-config-daemon-pmzvj\" (UID: \"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\") " pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217673 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-run-ovn\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217693 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-run-ovn-kubernetes\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217715 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-slash\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217736 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-run-systemd\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217756 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-cni-bin\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.217774 4703 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-cni-netd\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.223885 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.239548 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.256871 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.268640 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.279398 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.285744 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.285791 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.285803 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.285821 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.285836 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:39Z","lastTransitionTime":"2026-03-09T13:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.289719 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.297616 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.308732 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.318694 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-host-var-lib-cni-bin\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.318733 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-host-var-lib-kubelet\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.318757 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9psl\" (UniqueName: \"kubernetes.io/projected/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-kube-api-access-x9psl\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.318781 4703 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/4316a119-ceb8-44c1-a4ad-2d64ca0c0f29-rootfs\") pod \"machine-config-daemon-pmzvj\" (UID: \"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\") " pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.318802 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-hostroot\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.318825 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d59f2278-9dbc-48bb-8d56-fa9da4183118-multus-daemon-config\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.318867 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-ovnkube-config\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.318890 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-ovnkube-script-lib\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.318914 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-var-lib-openvswitch\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.318929 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-host-var-lib-kubelet\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.318968 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-hostroot\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319007 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-multus-socket-dir-parent\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319006 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/4316a119-ceb8-44c1-a4ad-2d64ca0c0f29-rootfs\") pod \"machine-config-daemon-pmzvj\" (UID: \"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\") " pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.318937 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-multus-socket-dir-parent\") pod \"multus-n9x5k\" (UID: 
\"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319065 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-log-socket\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319089 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4fbab78b-1484-4244-8d11-ec4f47b43718-cni-binary-copy\") pod \"multus-additional-cni-plugins-rwff8\" (UID: \"4fbab78b-1484-4244-8d11-ec4f47b43718\") " pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319110 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-cnibin\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319133 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-host-var-lib-cni-multus\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319158 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-etc-kubernetes\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc 
kubenswrapper[4703]: I0309 13:21:39.319180 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-run-openvswitch\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319201 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-kubelet\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319224 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319259 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4316a119-ceb8-44c1-a4ad-2d64ca0c0f29-proxy-tls\") pod \"machine-config-daemon-pmzvj\" (UID: \"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\") " pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319285 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4fbab78b-1484-4244-8d11-ec4f47b43718-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rwff8\" (UID: \"4fbab78b-1484-4244-8d11-ec4f47b43718\") " 
pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319309 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-system-cni-dir\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319331 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-host-run-k8s-cni-cncf-io\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319353 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-host-run-netns\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319377 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-ovn-node-metrics-cert\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319400 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4fbab78b-1484-4244-8d11-ec4f47b43718-system-cni-dir\") pod \"multus-additional-cni-plugins-rwff8\" (UID: \"4fbab78b-1484-4244-8d11-ec4f47b43718\") " pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc 
kubenswrapper[4703]: I0309 13:21:39.319423 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2311ba56-bb75-4876-ad86-6c74012001ae-hosts-file\") pod \"node-resolver-q5n88\" (UID: \"2311ba56-bb75-4876-ad86-6c74012001ae\") " pod="openshift-dns/node-resolver-q5n88" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319446 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-host-run-multus-certs\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319469 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-node-log\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319491 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkddz\" (UniqueName: \"kubernetes.io/projected/4fbab78b-1484-4244-8d11-ec4f47b43718-kube-api-access-tkddz\") pod \"multus-additional-cni-plugins-rwff8\" (UID: \"4fbab78b-1484-4244-8d11-ec4f47b43718\") " pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319514 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbxp9\" (UniqueName: \"kubernetes.io/projected/2311ba56-bb75-4876-ad86-6c74012001ae-kube-api-access-sbxp9\") pod \"node-resolver-q5n88\" (UID: \"2311ba56-bb75-4876-ad86-6c74012001ae\") " pod="openshift-dns/node-resolver-q5n88" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319536 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d59f2278-9dbc-48bb-8d56-fa9da4183118-cni-binary-copy\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319557 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-multus-conf-dir\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319594 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-etc-openvswitch\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319620 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swqtr\" (UniqueName: \"kubernetes.io/projected/d59f2278-9dbc-48bb-8d56-fa9da4183118-kube-api-access-swqtr\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319641 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-systemd-units\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319665 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z2rc\" (UniqueName: 
\"kubernetes.io/projected/4316a119-ceb8-44c1-a4ad-2d64ca0c0f29-kube-api-access-4z2rc\") pod \"machine-config-daemon-pmzvj\" (UID: \"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\") " pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319691 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4fbab78b-1484-4244-8d11-ec4f47b43718-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rwff8\" (UID: \"4fbab78b-1484-4244-8d11-ec4f47b43718\") " pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319715 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-os-release\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319722 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-host-run-netns\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319764 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4fbab78b-1484-4244-8d11-ec4f47b43718-os-release\") pod \"multus-additional-cni-plugins-rwff8\" (UID: \"4fbab78b-1484-4244-8d11-ec4f47b43718\") " pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319783 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-env-overrides\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319785 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-os-release\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319801 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4316a119-ceb8-44c1-a4ad-2d64ca0c0f29-mcd-auth-proxy-config\") pod \"machine-config-daemon-pmzvj\" (UID: \"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\") " pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319818 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-run-ovn\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319823 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-ovnkube-config\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319836 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-run-ovn-kubernetes\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319883 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-run-ovn-kubernetes\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319903 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-slash\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319927 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-log-socket\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319933 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-run-systemd\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319957 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-cni-bin\") pod 
\"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319978 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-cni-netd\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320004 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-run-netns\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320047 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4fbab78b-1484-4244-8d11-ec4f47b43718-cnibin\") pod \"multus-additional-cni-plugins-rwff8\" (UID: \"4fbab78b-1484-4244-8d11-ec4f47b43718\") " pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320071 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-multus-cni-dir\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320305 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-multus-cni-dir\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " 
pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320323 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4fbab78b-1484-4244-8d11-ec4f47b43718-system-cni-dir\") pod \"multus-additional-cni-plugins-rwff8\" (UID: \"4fbab78b-1484-4244-8d11-ec4f47b43718\") " pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320344 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-slash\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319910 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-var-lib-openvswitch\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320397 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2311ba56-bb75-4876-ad86-6c74012001ae-hosts-file\") pod \"node-resolver-q5n88\" (UID: \"2311ba56-bb75-4876-ad86-6c74012001ae\") " pod="openshift-dns/node-resolver-q5n88" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320399 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-host-run-multus-certs\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320429 4703 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-run-systemd\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320447 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-node-log\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320459 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-cni-bin\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320472 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4fbab78b-1484-4244-8d11-ec4f47b43718-cni-binary-copy\") pod \"multus-additional-cni-plugins-rwff8\" (UID: \"4fbab78b-1484-4244-8d11-ec4f47b43718\") " pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320493 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4fbab78b-1484-4244-8d11-ec4f47b43718-cnibin\") pod \"multus-additional-cni-plugins-rwff8\" (UID: \"4fbab78b-1484-4244-8d11-ec4f47b43718\") " pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320502 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-run-netns\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320521 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-cnibin\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320500 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d59f2278-9dbc-48bb-8d56-fa9da4183118-multus-daemon-config\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320558 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-system-cni-dir\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320560 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-run-openvswitch\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320529 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-cni-netd\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320585 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-host-run-k8s-cni-cncf-io\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320619 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-host-var-lib-cni-multus\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320663 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-etc-openvswitch\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320664 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-etc-kubernetes\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320707 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-systemd-units\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320721 4703 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4fbab78b-1484-4244-8d11-ec4f47b43718-os-release\") pod \"multus-additional-cni-plugins-rwff8\" (UID: \"4fbab78b-1484-4244-8d11-ec4f47b43718\") " pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320721 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.320752 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-kubelet\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.319690 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-ovnkube-script-lib\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.318868 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-host-var-lib-cni-bin\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.321087 4703 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4fbab78b-1484-4244-8d11-ec4f47b43718-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rwff8\" (UID: \"4fbab78b-1484-4244-8d11-ec4f47b43718\") " pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.321125 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d59f2278-9dbc-48bb-8d56-fa9da4183118-cni-binary-copy\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.321132 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-run-ovn\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.321161 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d59f2278-9dbc-48bb-8d56-fa9da4183118-multus-conf-dir\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.321430 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-env-overrides\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.321658 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/4316a119-ceb8-44c1-a4ad-2d64ca0c0f29-mcd-auth-proxy-config\") pod \"machine-config-daemon-pmzvj\" (UID: \"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\") " pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.321659 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4fbab78b-1484-4244-8d11-ec4f47b43718-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rwff8\" (UID: \"4fbab78b-1484-4244-8d11-ec4f47b43718\") " pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.324814 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.326453 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4316a119-ceb8-44c1-a4ad-2d64ca0c0f29-proxy-tls\") pod \"machine-config-daemon-pmzvj\" (UID: \"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\") " pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.336262 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swqtr\" (UniqueName: \"kubernetes.io/projected/d59f2278-9dbc-48bb-8d56-fa9da4183118-kube-api-access-swqtr\") pod \"multus-n9x5k\" (UID: \"d59f2278-9dbc-48bb-8d56-fa9da4183118\") " pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.336390 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-ovn-node-metrics-cert\") pod \"ovnkube-node-9khwq\" (UID: 
\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.338529 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9psl\" (UniqueName: \"kubernetes.io/projected/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-kube-api-access-x9psl\") pod \"ovnkube-node-9khwq\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.338996 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbxp9\" (UniqueName: \"kubernetes.io/projected/2311ba56-bb75-4876-ad86-6c74012001ae-kube-api-access-sbxp9\") pod \"node-resolver-q5n88\" (UID: \"2311ba56-bb75-4876-ad86-6c74012001ae\") " pod="openshift-dns/node-resolver-q5n88" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.341469 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.346381 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkddz\" (UniqueName: \"kubernetes.io/projected/4fbab78b-1484-4244-8d11-ec4f47b43718-kube-api-access-tkddz\") pod \"multus-additional-cni-plugins-rwff8\" (UID: \"4fbab78b-1484-4244-8d11-ec4f47b43718\") " pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.355948 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.356260 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z2rc\" (UniqueName: \"kubernetes.io/projected/4316a119-ceb8-44c1-a4ad-2d64ca0c0f29-kube-api-access-4z2rc\") pod \"machine-config-daemon-pmzvj\" (UID: \"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\") " pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.367013 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.382194 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.388280 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.388313 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.388321 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.388335 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.388347 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:39Z","lastTransitionTime":"2026-03-09T13:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.395150 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.403499 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.420007 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.430877 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.439489 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.450174 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:39Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.482881 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-q5n88" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.490071 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.490114 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.490125 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.490143 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.490157 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:39Z","lastTransitionTime":"2026-03-09T13:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.491233 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rwff8" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.498746 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" Mar 09 13:21:39 crc kubenswrapper[4703]: W0309 13:21:39.504206 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fbab78b_1484_4244_8d11_ec4f47b43718.slice/crio-dabacdf05318c4e3b8efdc0769ded7ecdc6e16c76927b64b9c4a15d4f8944cd7 WatchSource:0}: Error finding container dabacdf05318c4e3b8efdc0769ded7ecdc6e16c76927b64b9c4a15d4f8944cd7: Status 404 returned error can't find the container with id dabacdf05318c4e3b8efdc0769ded7ecdc6e16c76927b64b9c4a15d4f8944cd7 Mar 09 13:21:39 crc kubenswrapper[4703]: W0309 13:21:39.509970 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4316a119_ceb8_44c1_a4ad_2d64ca0c0f29.slice/crio-3e09b45107ac7a5716349567e880914d07572df0a06546867b8945a4e583c19f WatchSource:0}: Error finding container 3e09b45107ac7a5716349567e880914d07572df0a06546867b8945a4e583c19f: Status 404 returned error can't find the container with id 3e09b45107ac7a5716349567e880914d07572df0a06546867b8945a4e583c19f Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.518012 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" event={"ID":"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29","Type":"ContainerStarted","Data":"3e09b45107ac7a5716349567e880914d07572df0a06546867b8945a4e583c19f"} Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.522866 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-n9x5k" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.522943 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" event={"ID":"4fbab78b-1484-4244-8d11-ec4f47b43718","Type":"ContainerStarted","Data":"dabacdf05318c4e3b8efdc0769ded7ecdc6e16c76927b64b9c4a15d4f8944cd7"} Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.524750 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-q5n88" event={"ID":"2311ba56-bb75-4876-ad86-6c74012001ae","Type":"ContainerStarted","Data":"dae48ebe24a3511afe2fc38396df5dfdcdf914ac12d9be48db3a940cdf56b246"} Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.525912 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:39 crc kubenswrapper[4703]: W0309 13:21:39.551239 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd59f2278_9dbc_48bb_8d56_fa9da4183118.slice/crio-fa3fc4f1dea4a96393502ffc6267f90607b3f25899f7a7614114208e179bbb8d WatchSource:0}: Error finding container fa3fc4f1dea4a96393502ffc6267f90607b3f25899f7a7614114208e179bbb8d: Status 404 returned error can't find the container with id fa3fc4f1dea4a96393502ffc6267f90607b3f25899f7a7614114208e179bbb8d Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.592568 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.592603 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.592611 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 
13:21:39.592625 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.592634 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:39Z","lastTransitionTime":"2026-03-09T13:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.694621 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.695013 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.695025 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.695041 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.695054 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:39Z","lastTransitionTime":"2026-03-09T13:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.706201 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.706262 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:21:39 crc kubenswrapper[4703]: E0309 13:21:39.706305 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:21:39 crc kubenswrapper[4703]: E0309 13:21:39.706396 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.798941 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.798982 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.798994 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.799011 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.799023 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:39Z","lastTransitionTime":"2026-03-09T13:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.902053 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.902107 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.902125 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.902152 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:39 crc kubenswrapper[4703]: I0309 13:21:39.902168 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:39Z","lastTransitionTime":"2026-03-09T13:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.004429 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.004471 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.004482 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.004500 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.004511 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:40Z","lastTransitionTime":"2026-03-09T13:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.107491 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.107541 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.107555 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.107575 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.107589 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:40Z","lastTransitionTime":"2026-03-09T13:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.211699 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.211742 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.211758 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.211782 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.211797 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:40Z","lastTransitionTime":"2026-03-09T13:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.314370 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.314407 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.314419 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.314433 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.314445 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:40Z","lastTransitionTime":"2026-03-09T13:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.416351 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.416387 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.416397 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.416413 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.416423 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:40Z","lastTransitionTime":"2026-03-09T13:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.518782 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.518822 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.518833 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.518861 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.518873 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:40Z","lastTransitionTime":"2026-03-09T13:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.530107 4703 generic.go:334] "Generic (PLEG): container finished" podID="4fbab78b-1484-4244-8d11-ec4f47b43718" containerID="2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f" exitCode=0 Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.530214 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" event={"ID":"4fbab78b-1484-4244-8d11-ec4f47b43718","Type":"ContainerDied","Data":"2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f"} Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.531939 4703 generic.go:334] "Generic (PLEG): container finished" podID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerID="1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6" exitCode=0 Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.532043 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerDied","Data":"1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6"} Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.532117 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerStarted","Data":"b3d382ee9222a9fc707c99ad08b0d8e03b192284f998f3c8b8d69be6b1142445"} Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.533728 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n9x5k" event={"ID":"d59f2278-9dbc-48bb-8d56-fa9da4183118","Type":"ContainerStarted","Data":"6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526"} Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.533764 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n9x5k" 
event={"ID":"d59f2278-9dbc-48bb-8d56-fa9da4183118","Type":"ContainerStarted","Data":"fa3fc4f1dea4a96393502ffc6267f90607b3f25899f7a7614114208e179bbb8d"} Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.535355 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-q5n88" event={"ID":"2311ba56-bb75-4876-ad86-6c74012001ae","Type":"ContainerStarted","Data":"b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98"} Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.538171 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" event={"ID":"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29","Type":"ContainerStarted","Data":"7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251"} Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.538208 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" event={"ID":"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29","Type":"ContainerStarted","Data":"4c9fc3d67a15d4a66d56ea7b57899de24cd14117fceeac979967d0eb1b49823f"} Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.548213 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.576496 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.591365 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.600499 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.612113 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.621826 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.621868 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.621878 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.621891 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.621901 4703 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:40Z","lastTransitionTime":"2026-03-09T13:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.623725 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.635638 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2b
b72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.647315 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.660230 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.671590 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.684233 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.694737 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.704741 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.706268 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:40 crc kubenswrapper[4703]: E0309 13:21:40.706515 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.718448 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.718917 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.724660 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.724718 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.724733 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.724755 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.724769 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:40Z","lastTransitionTime":"2026-03-09T13:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.729178 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.748012 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.758611 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.772745 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.787278 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.798080 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.812137 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.826666 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.827158 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:40 crc 
kubenswrapper[4703]: I0309 13:21:40.827210 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.827222 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.827242 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.827255 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:40Z","lastTransitionTime":"2026-03-09T13:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.839344 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.853765 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:40Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.929309 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.929342 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.929351 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.929364 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:40 crc kubenswrapper[4703]: I0309 13:21:40.929373 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:40Z","lastTransitionTime":"2026-03-09T13:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.031785 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.031824 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.031834 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.031871 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.031886 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:41Z","lastTransitionTime":"2026-03-09T13:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.133988 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.134107 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.134117 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.134133 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.134143 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:41Z","lastTransitionTime":"2026-03-09T13:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.238066 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.238107 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.238117 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.238131 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.238139 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:41Z","lastTransitionTime":"2026-03-09T13:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.340463 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.340518 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.340536 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.340556 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.340569 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:41Z","lastTransitionTime":"2026-03-09T13:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.442546 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.442596 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.442605 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.442620 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.442629 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:41Z","lastTransitionTime":"2026-03-09T13:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.541627 4703 generic.go:334] "Generic (PLEG): container finished" podID="4fbab78b-1484-4244-8d11-ec4f47b43718" containerID="99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37" exitCode=0 Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.541732 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" event={"ID":"4fbab78b-1484-4244-8d11-ec4f47b43718","Type":"ContainerDied","Data":"99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37"} Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.546241 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.546281 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.546291 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.546305 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.546320 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:41Z","lastTransitionTime":"2026-03-09T13:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.550362 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerStarted","Data":"19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd"} Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.550402 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerStarted","Data":"01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9"} Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.550414 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerStarted","Data":"3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74"} Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.550421 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerStarted","Data":"75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c"} Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.550429 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerStarted","Data":"b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b"} Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.550436 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerStarted","Data":"9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14"} Mar 09 13:21:41 crc kubenswrapper[4703]: 
I0309 13:21:41.557034 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa03
9bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.576592 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.591538 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.602542 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.616968 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.629579 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.645233 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T13:21:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.649038 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.649063 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.649076 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.649092 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.649104 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:41Z","lastTransitionTime":"2026-03-09T13:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.657887 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.669445 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.690894 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.703562 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.706204 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:21:41 crc kubenswrapper[4703]: E0309 13:21:41.706295 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.706471 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:21:41 crc kubenswrapper[4703]: E0309 13:21:41.706549 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.713401 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5
ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.718490 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.719409 4703 scope.go:117] "RemoveContainer" containerID="88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565" Mar 09 13:21:41 crc kubenswrapper[4703]: E0309 13:21:41.719590 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" 
with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.735652 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:41Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.751365 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.751395 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.751403 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.751415 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.751424 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:41Z","lastTransitionTime":"2026-03-09T13:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.853879 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.853906 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.853914 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.853926 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.853934 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:41Z","lastTransitionTime":"2026-03-09T13:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.956227 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.956276 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.956285 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.956298 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:41 crc kubenswrapper[4703]: I0309 13:21:41.956308 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:41Z","lastTransitionTime":"2026-03-09T13:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.059159 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.059240 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.059263 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.059297 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.059318 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:42Z","lastTransitionTime":"2026-03-09T13:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.148631 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:21:42 crc kubenswrapper[4703]: E0309 13:21:42.148810 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 13:21:50.148783377 +0000 UTC m=+106.116199093 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.162600 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.162696 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.162720 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.162745 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.162765 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:42Z","lastTransitionTime":"2026-03-09T13:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.250687 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:42 crc kubenswrapper[4703]: E0309 13:21:42.250944 4703 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:21:42 crc kubenswrapper[4703]: E0309 13:21:42.251303 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:50.251263511 +0000 UTC m=+106.218679247 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:21:42 crc kubenswrapper[4703]: E0309 13:21:42.252437 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:21:42 crc kubenswrapper[4703]: E0309 13:21:42.252482 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:21:42 crc kubenswrapper[4703]: E0309 13:21:42.252501 4703 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:42 crc kubenswrapper[4703]: E0309 13:21:42.252607 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:50.25258908 +0000 UTC m=+106.220004796 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.252230 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.252702 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.252804 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:21:42 crc kubenswrapper[4703]: E0309 13:21:42.252878 4703 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:21:42 crc kubenswrapper[4703]: E0309 13:21:42.252973 4703 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:50.25294501 +0000 UTC m=+106.220360686 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:21:42 crc kubenswrapper[4703]: E0309 13:21:42.252999 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:21:42 crc kubenswrapper[4703]: E0309 13:21:42.253051 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:21:42 crc kubenswrapper[4703]: E0309 13:21:42.253067 4703 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:42 crc kubenswrapper[4703]: E0309 13:21:42.253133 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:50.253118815 +0000 UTC m=+106.220534531 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.266098 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.266135 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.266146 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.266165 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.266177 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:42Z","lastTransitionTime":"2026-03-09T13:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.369144 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.369178 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.369189 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.369203 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.369214 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:42Z","lastTransitionTime":"2026-03-09T13:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.471686 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.471724 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.471736 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.471750 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.471758 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:42Z","lastTransitionTime":"2026-03-09T13:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.568143 4703 generic.go:334] "Generic (PLEG): container finished" podID="4fbab78b-1484-4244-8d11-ec4f47b43718" containerID="4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce" exitCode=0 Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.568635 4703 scope.go:117] "RemoveContainer" containerID="88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565" Mar 09 13:21:42 crc kubenswrapper[4703]: E0309 13:21:42.568818 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.569349 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" event={"ID":"4fbab78b-1484-4244-8d11-ec4f47b43718","Type":"ContainerDied","Data":"4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce"} Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.573484 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.573504 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.573511 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.573521 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 
13:21:42.573530 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:42Z","lastTransitionTime":"2026-03-09T13:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.583440 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.598734 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.610859 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.621419 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.629123 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.656838 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.667776 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.675903 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.675938 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.675950 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.675966 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.675976 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:42Z","lastTransitionTime":"2026-03-09T13:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.680108 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.691141 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:21:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.705234 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.706426 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:42 crc kubenswrapper[4703]: E0309 13:21:42.706540 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.724931 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.741887 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.761948 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 
13:21:42.773564 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:42Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 
13:21:42.778426 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.778461 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.778473 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.778493 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.778505 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:42Z","lastTransitionTime":"2026-03-09T13:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.881872 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.881916 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.881927 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.881945 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.881960 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:42Z","lastTransitionTime":"2026-03-09T13:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.984525 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.984587 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.984602 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.984618 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:42 crc kubenswrapper[4703]: I0309 13:21:42.984629 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:42Z","lastTransitionTime":"2026-03-09T13:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.087339 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.087437 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.087451 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.087467 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.087477 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:43Z","lastTransitionTime":"2026-03-09T13:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.192543 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.192584 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.192593 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.192606 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.192618 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:43Z","lastTransitionTime":"2026-03-09T13:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.294453 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.294485 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.294495 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.294508 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.294518 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:43Z","lastTransitionTime":"2026-03-09T13:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.396986 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.397012 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.397021 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.397034 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.397043 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:43Z","lastTransitionTime":"2026-03-09T13:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.556519 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.556561 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.556569 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.556581 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.556590 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:43Z","lastTransitionTime":"2026-03-09T13:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.573254 4703 generic.go:334] "Generic (PLEG): container finished" podID="4fbab78b-1484-4244-8d11-ec4f47b43718" containerID="9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63" exitCode=0 Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.573313 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" event={"ID":"4fbab78b-1484-4244-8d11-ec4f47b43718","Type":"ContainerDied","Data":"9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63"} Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.586088 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef
318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.600799 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.614060 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.629461 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\
"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.643073 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network
-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.656736 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.658968 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.659011 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.659022 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:43 crc 
kubenswrapper[4703]: I0309 13:21:43.659038 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.659051 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:43Z","lastTransitionTime":"2026-03-09T13:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.668424 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.683297 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.695075 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.702552 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.705873 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.706047 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:21:43 crc kubenswrapper[4703]: E0309 13:21:43.706155 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:21:43 crc kubenswrapper[4703]: E0309 13:21:43.706283 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.720982 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.740414 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.751698 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.761742 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.761785 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.761797 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.761815 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.761827 4703 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:43Z","lastTransitionTime":"2026-03-09T13:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.770377 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:43Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.863700 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.863739 4703 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.863749 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.863764 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.863772 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:43Z","lastTransitionTime":"2026-03-09T13:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.969771 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.969801 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.969810 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.969825 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:43 crc kubenswrapper[4703]: I0309 13:21:43.969835 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:43Z","lastTransitionTime":"2026-03-09T13:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.072779 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.072884 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.072919 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.072960 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.072984 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:44Z","lastTransitionTime":"2026-03-09T13:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.176602 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.176683 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.176709 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.176740 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.176762 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:44Z","lastTransitionTime":"2026-03-09T13:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.279191 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.279218 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.279230 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.279246 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.279257 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:44Z","lastTransitionTime":"2026-03-09T13:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.385643 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.385678 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.385688 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.385705 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.385716 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:44Z","lastTransitionTime":"2026-03-09T13:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.487475 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.487516 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.487531 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.487549 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.487561 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:44Z","lastTransitionTime":"2026-03-09T13:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.578687 4703 generic.go:334] "Generic (PLEG): container finished" podID="4fbab78b-1484-4244-8d11-ec4f47b43718" containerID="75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b" exitCode=0 Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.578780 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" event={"ID":"4fbab78b-1484-4244-8d11-ec4f47b43718","Type":"ContainerDied","Data":"75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b"} Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.584955 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerStarted","Data":"6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255"} Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.593613 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.593661 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.593676 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.593696 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.593714 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:44Z","lastTransitionTime":"2026-03-09T13:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.594259 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.613500 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.627594 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682205
9f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.640412 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.663894 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.696235 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.696270 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.696278 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.696291 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.696300 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:44Z","lastTransitionTime":"2026-03-09T13:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.696821 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.711081 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:44 crc kubenswrapper[4703]: E0309 13:21:44.711200 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.715345 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.728279 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911
c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.738144 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.745919 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.764169 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.782880 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.792429 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.798233 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.798259 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.798270 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.798285 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.798298 4703 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:44Z","lastTransitionTime":"2026-03-09T13:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.804736 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.815401 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.834519 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.847032 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.859029 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.869527 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.888150 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.896558 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.900419 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.900445 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.900453 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.900466 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.900474 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:44Z","lastTransitionTime":"2026-03-09T13:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.907384 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.918040 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.927386 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.941546 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.952994 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://682205
9f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.963992 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:44 crc kubenswrapper[4703]: I0309 13:21:44.975582 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:44Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.003260 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.003297 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.003307 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.003323 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.003334 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:45Z","lastTransitionTime":"2026-03-09T13:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.106452 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.106530 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.106565 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.106594 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.106617 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:45Z","lastTransitionTime":"2026-03-09T13:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.209925 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.209981 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.209999 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.210023 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.210040 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:45Z","lastTransitionTime":"2026-03-09T13:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.312420 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.312487 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.312511 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.312541 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.312563 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:45Z","lastTransitionTime":"2026-03-09T13:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.415720 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.415767 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.415779 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.415797 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.415809 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:45Z","lastTransitionTime":"2026-03-09T13:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.518300 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.518338 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.518346 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.518359 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.518370 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:45Z","lastTransitionTime":"2026-03-09T13:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.590901 4703 generic.go:334] "Generic (PLEG): container finished" podID="4fbab78b-1484-4244-8d11-ec4f47b43718" containerID="dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5" exitCode=0 Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.590994 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" event={"ID":"4fbab78b-1484-4244-8d11-ec4f47b43718","Type":"ContainerDied","Data":"dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5"} Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.605552 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.624357 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa5
0f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.624670 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.624705 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.624715 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.624732 4703 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.624744 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:45Z","lastTransitionTime":"2026-03-09T13:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.639003 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.654684 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.666722 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.676583 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.690417 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.705218 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.706778 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.706790 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:21:45 crc kubenswrapper[4703]: E0309 13:21:45.706953 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:21:45 crc kubenswrapper[4703]: E0309 13:21:45.707143 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.717908 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.726710 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.726731 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.726741 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.726754 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.726763 4703 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:45Z","lastTransitionTime":"2026-03-09T13:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.735550 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.755110 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.772128 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.801208 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.814477 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.829503 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.829550 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.829562 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.829577 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.829586 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:45Z","lastTransitionTime":"2026-03-09T13:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.865682 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-2gp76"] Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.866090 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-2gp76" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.867493 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.867836 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.868081 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.868178 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.878647 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.891277 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.904126 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/186829e9-3995-46ff-807c-06bf53e9a4e0-host\") pod \"node-ca-2gp76\" (UID: \"186829e9-3995-46ff-807c-06bf53e9a4e0\") " pod="openshift-image-registry/node-ca-2gp76" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.904172 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28tts\" (UniqueName: \"kubernetes.io/projected/186829e9-3995-46ff-807c-06bf53e9a4e0-kube-api-access-28tts\") pod \"node-ca-2gp76\" (UID: \"186829e9-3995-46ff-807c-06bf53e9a4e0\") " pod="openshift-image-registry/node-ca-2gp76" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.904191 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/186829e9-3995-46ff-807c-06bf53e9a4e0-serviceca\") pod \"node-ca-2gp76\" (UID: \"186829e9-3995-46ff-807c-06bf53e9a4e0\") " pod="openshift-image-registry/node-ca-2gp76" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.906620 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.919010 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.931078 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.931111 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.931122 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:45 crc 
kubenswrapper[4703]: I0309 13:21:45.931137 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.931147 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:45Z","lastTransitionTime":"2026-03-09T13:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.935696 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.950817 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.965296 4703 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:45 crc kubenswrapper[4703]: I0309 13:21:45.992081 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:45Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.005074 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/186829e9-3995-46ff-807c-06bf53e9a4e0-serviceca\") pod \"node-ca-2gp76\" (UID: \"186829e9-3995-46ff-807c-06bf53e9a4e0\") " pod="openshift-image-registry/node-ca-2gp76" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.005154 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/186829e9-3995-46ff-807c-06bf53e9a4e0-host\") pod \"node-ca-2gp76\" (UID: \"186829e9-3995-46ff-807c-06bf53e9a4e0\") " pod="openshift-image-registry/node-ca-2gp76" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.005197 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28tts\" (UniqueName: \"kubernetes.io/projected/186829e9-3995-46ff-807c-06bf53e9a4e0-kube-api-access-28tts\") pod \"node-ca-2gp76\" (UID: \"186829e9-3995-46ff-807c-06bf53e9a4e0\") " pod="openshift-image-registry/node-ca-2gp76" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.005267 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/186829e9-3995-46ff-807c-06bf53e9a4e0-host\") pod \"node-ca-2gp76\" (UID: 
\"186829e9-3995-46ff-807c-06bf53e9a4e0\") " pod="openshift-image-registry/node-ca-2gp76" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.006533 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.008286 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/186829e9-3995-46ff-807c-06bf53e9a4e0-serviceca\") pod \"node-ca-2gp76\" (UID: \"186829e9-3995-46ff-807c-06bf53e9a4e0\") " pod="openshift-image-registry/node-ca-2gp76" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.020713 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.024523 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28tts\" (UniqueName: \"kubernetes.io/projected/186829e9-3995-46ff-807c-06bf53e9a4e0-kube-api-access-28tts\") pod \"node-ca-2gp76\" (UID: \"186829e9-3995-46ff-807c-06bf53e9a4e0\") " pod="openshift-image-registry/node-ca-2gp76" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.034086 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:46 
crc kubenswrapper[4703]: I0309 13:21:46.034131 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.034148 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.034169 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.034184 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:46Z","lastTransitionTime":"2026-03-09T13:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.036495 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.053549 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.063988 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.077056 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.086036 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.136175 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.136213 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.136223 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 
13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.136234 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.136242 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:46Z","lastTransitionTime":"2026-03-09T13:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.199156 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2gp76" Mar 09 13:21:46 crc kubenswrapper[4703]: W0309 13:21:46.213708 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod186829e9_3995_46ff_807c_06bf53e9a4e0.slice/crio-4e00985ec6b6406b38778d85c73f7b3fae3da7a15368891bef29244161e40f84 WatchSource:0}: Error finding container 4e00985ec6b6406b38778d85c73f7b3fae3da7a15368891bef29244161e40f84: Status 404 returned error can't find the container with id 4e00985ec6b6406b38778d85c73f7b3fae3da7a15368891bef29244161e40f84 Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.238972 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.239017 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.239029 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.239046 4703 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.239058 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:46Z","lastTransitionTime":"2026-03-09T13:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.341385 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.341431 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.341444 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.341463 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.341477 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:46Z","lastTransitionTime":"2026-03-09T13:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.445327 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.445395 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.445416 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.445443 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.445462 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:46Z","lastTransitionTime":"2026-03-09T13:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.547784 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.547832 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.547888 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.547907 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.547918 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:46Z","lastTransitionTime":"2026-03-09T13:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.600558 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerStarted","Data":"91038af82b7824d780df56614bd8d1d000dabd1f1ef6d32835ef0d81855f47d7"} Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.600830 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.600864 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.602077 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2gp76" event={"ID":"186829e9-3995-46ff-807c-06bf53e9a4e0","Type":"ContainerStarted","Data":"4e00985ec6b6406b38778d85c73f7b3fae3da7a15368891bef29244161e40f84"} Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.622242 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.630082 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" event={"ID":"4fbab78b-1484-4244-8d11-ec4f47b43718","Type":"ContainerStarted","Data":"9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e"} Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.635095 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host
-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.638566 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.645805 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.647195 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.655232 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.655288 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.655300 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.655319 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.655336 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:46Z","lastTransitionTime":"2026-03-09T13:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.662322 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.662350 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.662358 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.662371 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.662379 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:46Z","lastTransitionTime":"2026-03-09T13:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.665439 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\
\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: E0309 13:21:46.674150 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.676992 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.677030 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.677041 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.677064 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.677081 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:46Z","lastTransitionTime":"2026-03-09T13:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.679156 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: E0309 13:21:46.687636 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.692445 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.692480 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.692488 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.692503 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.692513 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:46Z","lastTransitionTime":"2026-03-09T13:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.693879 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z 
is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: E0309 13:21:46.703328 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.706536 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:46 crc kubenswrapper[4703]: E0309 13:21:46.706656 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.708666 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.711124 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.711179 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.711192 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.711211 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.711258 4703 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:46Z","lastTransitionTime":"2026-03-09T13:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:46 crc kubenswrapper[4703]: E0309 13:21:46.722363 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.726134 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.726594 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.726658 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.726674 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.726692 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.726707 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:46Z","lastTransitionTime":"2026-03-09T13:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:46 crc kubenswrapper[4703]: E0309 13:21:46.737148 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: E0309 13:21:46.737260 4703 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.739165 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.749256 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.757346 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.757990 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.758025 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.758038 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.758055 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.758066 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:46Z","lastTransitionTime":"2026-03-09T13:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.778176 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91038af82b7824d780df56614bd8d1d000dabd1f1ef6d32835ef0d81855f47d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.786169 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e
18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc 
kubenswrapper[4703]: I0309 13:21:46.798045 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.805925 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.816445 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.827798 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.837598 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.848104 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.859873 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.860237 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.860286 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.860296 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.860312 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.860322 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:46Z","lastTransitionTime":"2026-03-09T13:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.872652 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.882729 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.895729 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bf
cb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.905747 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.916063 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.933078 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.950017 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.962174 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.963119 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.963158 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.963167 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.963183 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.963192 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:46Z","lastTransitionTime":"2026-03-09T13:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.972162 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"
},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:46 crc kubenswrapper[4703]: I0309 13:21:46.996987 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91038af82b7824d780df56614bd8d1d000dabd1f1ef6d32835ef0d81855f47d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:46Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.065928 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.065971 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.065981 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.065996 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.066007 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:47Z","lastTransitionTime":"2026-03-09T13:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.168558 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.168617 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.168631 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.168648 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.168660 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:47Z","lastTransitionTime":"2026-03-09T13:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.271272 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.271353 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.271372 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.271396 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.271414 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:47Z","lastTransitionTime":"2026-03-09T13:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.374344 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.374385 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.374399 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.374420 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.374435 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:47Z","lastTransitionTime":"2026-03-09T13:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.477472 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.477522 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.477537 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.477557 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.477572 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:47Z","lastTransitionTime":"2026-03-09T13:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.580712 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.580762 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.580780 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.580802 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.580820 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:47Z","lastTransitionTime":"2026-03-09T13:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.636518 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2gp76" event={"ID":"186829e9-3995-46ff-807c-06bf53e9a4e0","Type":"ContainerStarted","Data":"065c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba"} Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.638112 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.660259 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.683247 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.683310 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.683333 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.683361 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.683383 4703 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:47Z","lastTransitionTime":"2026-03-09T13:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.691599 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91038af82b7824d780df56614bd8d1d000dabd1f1ef6d32835ef0d81855f47d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.706991 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:21:47 crc kubenswrapper[4703]: E0309 13:21:47.707158 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.706998 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:21:47 crc kubenswrapper[4703]: E0309 13:21:47.707339 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.727186 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036
cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\
\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.748998 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.767116 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.778977 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.786535 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.786614 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.786629 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.786651 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.786666 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:47Z","lastTransitionTime":"2026-03-09T13:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.794531 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.815149 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.830382 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://065c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.846997 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T1
3:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.860469 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.874032 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.885735 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.889663 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.889713 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.889729 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:47 crc 
kubenswrapper[4703]: I0309 13:21:47.889749 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.889760 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:47Z","lastTransitionTime":"2026-03-09T13:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.899511 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc
3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.910934 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:47Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.991867 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:47 crc 
kubenswrapper[4703]: I0309 13:21:47.991917 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.991927 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.991942 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:47 crc kubenswrapper[4703]: I0309 13:21:47.991954 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:47Z","lastTransitionTime":"2026-03-09T13:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.094083 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.094124 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.094135 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.094150 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.094160 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:48Z","lastTransitionTime":"2026-03-09T13:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.196947 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.197003 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.197014 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.197029 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.197040 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:48Z","lastTransitionTime":"2026-03-09T13:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.299588 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.299636 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.299645 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.299660 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.299669 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:48Z","lastTransitionTime":"2026-03-09T13:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.402305 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.402418 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.402431 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.402447 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.402460 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:48Z","lastTransitionTime":"2026-03-09T13:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.505199 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.505239 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.505251 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.505267 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.505279 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:48Z","lastTransitionTime":"2026-03-09T13:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.607735 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.607784 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.607794 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.607862 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.607872 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:48Z","lastTransitionTime":"2026-03-09T13:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.706295 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:48 crc kubenswrapper[4703]: E0309 13:21:48.706497 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.709613 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.709656 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.709667 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.709687 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.709699 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:48Z","lastTransitionTime":"2026-03-09T13:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.811398 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.811427 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.811435 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.811449 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.811458 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:48Z","lastTransitionTime":"2026-03-09T13:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.913134 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.913164 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.913172 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.913185 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:48 crc kubenswrapper[4703]: I0309 13:21:48.913193 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:48Z","lastTransitionTime":"2026-03-09T13:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.015777 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.015815 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.015826 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.015854 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.015866 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:49Z","lastTransitionTime":"2026-03-09T13:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.118292 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.118336 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.118350 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.118370 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.118382 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:49Z","lastTransitionTime":"2026-03-09T13:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.222050 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.222483 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.222506 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.222538 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.222560 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:49Z","lastTransitionTime":"2026-03-09T13:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.324543 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.324584 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.324595 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.324610 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.324620 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:49Z","lastTransitionTime":"2026-03-09T13:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.427257 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.427323 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.427335 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.427350 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.427360 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:49Z","lastTransitionTime":"2026-03-09T13:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.529373 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.529422 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.529441 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.529464 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.529480 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:49Z","lastTransitionTime":"2026-03-09T13:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.631634 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.631688 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.631705 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.631726 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.631742 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:49Z","lastTransitionTime":"2026-03-09T13:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.645114 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9khwq_650f98b2-73a9-4c73-b0cf-70d3bdd61edd/ovnkube-controller/0.log" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.647799 4703 generic.go:334] "Generic (PLEG): container finished" podID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerID="91038af82b7824d780df56614bd8d1d000dabd1f1ef6d32835ef0d81855f47d7" exitCode=1 Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.647826 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerDied","Data":"91038af82b7824d780df56614bd8d1d000dabd1f1ef6d32835ef0d81855f47d7"} Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.648371 4703 scope.go:117] "RemoveContainer" containerID="91038af82b7824d780df56614bd8d1d000dabd1f1ef6d32835ef0d81855f47d7" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.663294 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.674976 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.697401 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91038af82b7824d780df56614bd8d1d000dabd1f1ef6d32835ef0d81855f47d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91038af82b7824d780df56614bd8d1d000dabd1f1ef6d32835ef0d81855f47d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:21:49Z\\\",\\\"message\\\":\\\" 6439 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:21:49.107569 6439 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0309 13:21:49.107615 6439 handler.go:190] Sending *v1.EgressFirewall 
event handler 9 for removal\\\\nI0309 13:21:49.107640 6439 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 13:21:49.107648 6439 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 13:21:49.107684 6439 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:21:49.107692 6439 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:21:49.107753 6439 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 13:21:49.107784 6439 factory.go:656] Stopping watch factory\\\\nI0309 13:21:49.107805 6439 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:21:49.107880 6439 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 13:21:49.107902 6439 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0309 13:21:49.107914 6439 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:21:49.107926 6439 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 13:21:49.107938 6439 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:21:49.107950 6439 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 
13:21:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a
8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.706671 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.706802 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:21:49 crc kubenswrapper[4703]: E0309 13:21:49.706906 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:21:49 crc kubenswrapper[4703]: E0309 13:21:49.706994 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.734067 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.734099 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.734109 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.734125 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.734136 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:49Z","lastTransitionTime":"2026-03-09T13:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.740576 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.779122 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.792050 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.808117 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.826520 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.836402 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.836435 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.836445 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.836458 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.836468 4703 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:49Z","lastTransitionTime":"2026-03-09T13:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.837358 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://065c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.854674 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.871519 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T13:21:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.884051 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.895737 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.907873 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.923606 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bf
cb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:49Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.944940 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.944976 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.944989 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.945014 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:49 crc kubenswrapper[4703]: I0309 13:21:49.945026 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:49Z","lastTransitionTime":"2026-03-09T13:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.047182 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.047232 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.047244 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.047261 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.047274 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:50Z","lastTransitionTime":"2026-03-09T13:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.150643 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.150688 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.150699 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.150717 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.150730 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:50Z","lastTransitionTime":"2026-03-09T13:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.153304 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:21:50 crc kubenswrapper[4703]: E0309 13:21:50.153796 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 13:22:06.153771241 +0000 UTC m=+122.121186947 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.253990 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.254068 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.254107 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:21:50 crc kubenswrapper[4703]: E0309 13:21:50.254110 4703 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:21:50 crc kubenswrapper[4703]: E0309 13:21:50.254181 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:21:50 crc kubenswrapper[4703]: E0309 13:21:50.254199 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:21:50 crc kubenswrapper[4703]: E0309 13:21:50.254209 4703 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.254157 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:50 crc kubenswrapper[4703]: E0309 13:21:50.254227 4703 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:21:50 crc kubenswrapper[4703]: E0309 13:21:50.254226 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:21:50 crc kubenswrapper[4703]: E0309 13:21:50.254452 
4703 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.254474 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.254495 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:50 crc kubenswrapper[4703]: E0309 13:21:50.254138 4703 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.254506 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:50 crc kubenswrapper[4703]: E0309 13:21:50.254250 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:06.254238457 +0000 UTC m=+122.221654143 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.254575 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:50 crc kubenswrapper[4703]: E0309 13:21:50.254604 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:06.254570697 +0000 UTC m=+122.221986453 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.254610 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:50Z","lastTransitionTime":"2026-03-09T13:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:50 crc kubenswrapper[4703]: E0309 13:21:50.254634 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:06.254618378 +0000 UTC m=+122.222034184 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:21:50 crc kubenswrapper[4703]: E0309 13:21:50.254665 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:06.254649029 +0000 UTC m=+122.222064845 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.357175 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.357238 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.357257 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.357286 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.357304 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:50Z","lastTransitionTime":"2026-03-09T13:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.459626 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.459684 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.459693 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.459707 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.459719 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:50Z","lastTransitionTime":"2026-03-09T13:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.561877 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.561919 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.561927 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.561942 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.561951 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:50Z","lastTransitionTime":"2026-03-09T13:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.653069 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9khwq_650f98b2-73a9-4c73-b0cf-70d3bdd61edd/ovnkube-controller/1.log" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.653757 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9khwq_650f98b2-73a9-4c73-b0cf-70d3bdd61edd/ovnkube-controller/0.log" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.656720 4703 generic.go:334] "Generic (PLEG): container finished" podID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerID="15f32ff09d3105428df57dfffe9686059fd157c5103baba5f42f041855ae49f4" exitCode=1 Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.656749 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerDied","Data":"15f32ff09d3105428df57dfffe9686059fd157c5103baba5f42f041855ae49f4"} Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.656796 4703 scope.go:117] "RemoveContainer" containerID="91038af82b7824d780df56614bd8d1d000dabd1f1ef6d32835ef0d81855f47d7" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.658075 4703 scope.go:117] "RemoveContainer" containerID="15f32ff09d3105428df57dfffe9686059fd157c5103baba5f42f041855ae49f4" Mar 09 13:21:50 crc kubenswrapper[4703]: E0309 13:21:50.658549 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-9khwq_openshift-ovn-kubernetes(650f98b2-73a9-4c73-b0cf-70d3bdd61edd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.664746 4703 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.664779 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.664790 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.664805 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.664817 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:50Z","lastTransitionTime":"2026-03-09T13:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.676767 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:50Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.704204 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch"] Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.704690 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:50Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.705097 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.708287 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.708422 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 09 13:21:50 crc kubenswrapper[4703]: E0309 13:21:50.708430 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.708651 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.724797 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:50Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.742324 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:50Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.756999 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:50Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.760448 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp62b\" (UniqueName: \"kubernetes.io/projected/42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e-kube-api-access-lp62b\") pod \"ovnkube-control-plane-749d76644c-2nrch\" (UID: \"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.760659 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2nrch\" (UID: \"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.760786 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2nrch\" (UID: \"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.760885 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2nrch\" (UID: \"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.768443 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.768495 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.768512 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.768536 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.768555 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:50Z","lastTransitionTime":"2026-03-09T13:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.788088 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15f32ff09d3105428df57dfffe9686059fd157c5103baba5f42f041855ae49f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91038af82b7824d780df56614bd8d1d000dabd1f1ef6d32835ef0d81855f47d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:21:49Z\\\",\\\"message\\\":\\\" 6439 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:21:49.107569 6439 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0309 13:21:49.107615 6439 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0309 13:21:49.107640 6439 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI0309 13:21:49.107648 6439 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 13:21:49.107684 6439 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:21:49.107692 6439 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:21:49.107753 6439 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 13:21:49.107784 6439 factory.go:656] Stopping watch factory\\\\nI0309 13:21:49.107805 6439 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:21:49.107880 6439 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 13:21:49.107902 6439 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0309 13:21:49.107914 6439 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:21:49.107926 6439 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 13:21:49.107938 6439 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:21:49.107950 6439 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 13:21:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15f32ff09d3105428df57dfffe9686059fd157c5103baba5f42f041855ae49f4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"message\\\":\\\"rvices.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.38\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:21:50.593183 6644 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.40:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2a3fb1a3-a476-4e14-bcf5-fb79af60206a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0309 13:21:50.592946 6644 services_controller.go:443] 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:50Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.802082 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://065c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:50Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.813971 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:50Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:50 crc 
kubenswrapper[4703]: I0309 13:21:50.832725 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:50Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.847544 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:50Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.860821 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:50Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.861485 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp62b\" (UniqueName: \"kubernetes.io/projected/42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e-kube-api-access-lp62b\") pod \"ovnkube-control-plane-749d76644c-2nrch\" (UID: \"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.861639 4703 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2nrch\" (UID: \"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.861682 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2nrch\" (UID: \"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.861721 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2nrch\" (UID: \"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.863164 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2nrch\" (UID: \"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.863609 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2nrch\" (UID: \"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.871457 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.871515 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.871536 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.871564 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.871586 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:50Z","lastTransitionTime":"2026-03-09T13:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.874530 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2nrch\" (UID: \"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.884479 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413
bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7
a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\
",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:50Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.892558 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp62b\" (UniqueName: \"kubernetes.io/projected/42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e-kube-api-access-lp62b\") pod \"ovnkube-control-plane-749d76644c-2nrch\" (UID: \"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.904303 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:50Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.931752 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T13:21:50Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.945115 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:50Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.953923 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:50Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.974515 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.974566 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.974577 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.974597 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.974612 4703 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:50Z","lastTransitionTime":"2026-03-09T13:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.976587 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15f32ff09d3105428df57dfffe9686059fd157c5103baba5f42f041855ae49f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91038af82b7824d780df56614bd8d1d000dabd1f1ef6d32835ef0d81855f47d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:21:49Z\\\",\\\"message\\\":\\\" 6439 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:21:49.107569 6439 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0309 13:21:49.107615 6439 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0309 13:21:49.107640 6439 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI0309 13:21:49.107648 6439 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 13:21:49.107684 6439 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:21:49.107692 6439 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:21:49.107753 6439 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 13:21:49.107784 6439 factory.go:656] Stopping watch factory\\\\nI0309 13:21:49.107805 6439 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:21:49.107880 6439 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 13:21:49.107902 6439 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0309 13:21:49.107914 6439 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:21:49.107926 6439 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 13:21:49.107938 6439 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:21:49.107950 6439 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 13:21:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15f32ff09d3105428df57dfffe9686059fd157c5103baba5f42f041855ae49f4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"message\\\":\\\"rvices.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.38\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:21:50.593183 6644 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.40:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2a3fb1a3-a476-4e14-bcf5-fb79af60206a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0309 13:21:50.592946 6644 services_controller.go:443] 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:50Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:50 crc kubenswrapper[4703]: I0309 13:21:50.997687 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:50Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.024321 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.038700 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.056218 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.068073 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.080545 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.080582 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.080593 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.080609 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.080619 4703 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:51Z","lastTransitionTime":"2026-03-09T13:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.087675 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.102244 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://065c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.115413 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.125764 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.139403 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.156687 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.173082 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.182752 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.182798 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.182811 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:51 crc 
kubenswrapper[4703]: I0309 13:21:51.182832 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.182873 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:51Z","lastTransitionTime":"2026-03-09T13:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.198518 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc
3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.210603 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.224678 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2nrch\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.286013 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.286065 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.286080 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.286100 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.286117 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:51Z","lastTransitionTime":"2026-03-09T13:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.389001 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.389053 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.389071 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.389096 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.389113 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:51Z","lastTransitionTime":"2026-03-09T13:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.491548 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.491590 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.491602 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.491618 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.491630 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:51Z","lastTransitionTime":"2026-03-09T13:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.594188 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.594220 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.594230 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.594245 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.594256 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:51Z","lastTransitionTime":"2026-03-09T13:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.661413 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9khwq_650f98b2-73a9-4c73-b0cf-70d3bdd61edd/ovnkube-controller/1.log" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.664734 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" event={"ID":"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e","Type":"ContainerStarted","Data":"0ac00693160cbdda2df9551e4cf1a43dfd092f407a573bc1a9ec25ef4b383911"} Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.664772 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" event={"ID":"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e","Type":"ContainerStarted","Data":"2e6a24cef765adee16f2208a4524331dfabdb46b2c9581f4fbe5e4c21e3fd394"} Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.664785 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" event={"ID":"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e","Type":"ContainerStarted","Data":"ae4803c30c687c9cbe2616a197201082d44b26cc40a9c517cf54854e1679ba22"} Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.676212 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.685048 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.695961 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.695990 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.696005 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.696021 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.696035 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:51Z","lastTransitionTime":"2026-03-09T13:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.697029 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.706453 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:21:51 crc kubenswrapper[4703]: E0309 13:21:51.706527 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.706461 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:21:51 crc kubenswrapper[4703]: E0309 13:21:51.706658 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.709451 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.719863 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.734744 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bf
cb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.745442 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.755423 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e6a24cef765adee16f2208a4524331dfabdb46b2c9581f4fbe5e4c21e3fd394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac00693160cbdda2df9551e4cf1a43dfd092
f407a573bc1a9ec25ef4b383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2nrch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.769018 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.790393 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15f32ff09d3105428df57dfffe9686059fd157c5103baba5f42f041855ae49f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91038af82b7824d780df56614bd8d1d000dabd1f1ef6d32835ef0d81855f47d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:21:49Z\\\",\\\"message\\\":\\\" 6439 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:21:49.107569 6439 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0309 13:21:49.107615 6439 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0309 13:21:49.107640 6439 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI0309 13:21:49.107648 6439 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 13:21:49.107684 6439 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:21:49.107692 6439 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:21:49.107753 6439 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 13:21:49.107784 6439 factory.go:656] Stopping watch factory\\\\nI0309 13:21:49.107805 6439 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:21:49.107880 6439 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 13:21:49.107902 6439 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0309 13:21:49.107914 6439 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:21:49.107926 6439 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 13:21:49.107938 6439 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:21:49.107950 6439 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 13:21:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15f32ff09d3105428df57dfffe9686059fd157c5103baba5f42f041855ae49f4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"message\\\":\\\"rvices.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.38\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:21:50.593183 6644 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.40:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2a3fb1a3-a476-4e14-bcf5-fb79af60206a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0309 13:21:50.592946 6644 services_controller.go:443] 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.798150 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.798193 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.798204 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.798223 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.798234 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:51Z","lastTransitionTime":"2026-03-09T13:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.813460 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\
",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b
1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.829310 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911
c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.842964 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.856168 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.871838 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.882756 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://065c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:51Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.900662 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.900723 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.900743 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.900769 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:51 crc kubenswrapper[4703]: I0309 13:21:51.900787 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:51Z","lastTransitionTime":"2026-03-09T13:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.003827 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.003876 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.003892 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.003906 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.003916 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:52Z","lastTransitionTime":"2026-03-09T13:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.107166 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.107203 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.107220 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.107239 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.107253 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:52Z","lastTransitionTime":"2026-03-09T13:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.209536 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.209572 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.209584 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.209601 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.209613 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:52Z","lastTransitionTime":"2026-03-09T13:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.312209 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.312249 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.312259 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.312275 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.312284 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:52Z","lastTransitionTime":"2026-03-09T13:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.415150 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.415200 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.415213 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.415231 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.415245 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:52Z","lastTransitionTime":"2026-03-09T13:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.517347 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.517396 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.517407 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.517425 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.517438 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:52Z","lastTransitionTime":"2026-03-09T13:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.582496 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-jlgk5"] Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.583018 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:21:52 crc kubenswrapper[4703]: E0309 13:21:52.583091 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.596240 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.609719 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.619631 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.619673 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.619687 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.619710 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.619725 4703 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:52Z","lastTransitionTime":"2026-03-09T13:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.622143 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://065c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.639415 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bf
cb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.652789 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.664489 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.680741 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2c4w\" (UniqueName: \"kubernetes.io/projected/967e7a44-ac71-42aa-9847-37799ff35cc0-kube-api-access-g2c4w\") pod \"network-metrics-daemon-jlgk5\" (UID: \"967e7a44-ac71-42aa-9847-37799ff35cc0\") " pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.680800 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs\") pod \"network-metrics-daemon-jlgk5\" (UID: \"967e7a44-ac71-42aa-9847-37799ff35cc0\") " pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.686053 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.700457 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.706613 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:52 crc kubenswrapper[4703]: E0309 13:21:52.706740 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.717058 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mo
untPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.721711 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:52 crc 
kubenswrapper[4703]: I0309 13:21:52.721754 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.721766 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.721790 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.721807 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:52Z","lastTransitionTime":"2026-03-09T13:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.732507 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.744147 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e6a24cef765adee16f2208a4524331dfabdb46b2c9581f4fbe5e4c21e3fd394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac00693160cbdda2df9551e4cf1a43dfd092
f407a573bc1a9ec25ef4b383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2nrch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.757903 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.766732 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.781949 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs\") pod \"network-metrics-daemon-jlgk5\" (UID: \"967e7a44-ac71-42aa-9847-37799ff35cc0\") " pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.782084 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2c4w\" (UniqueName: \"kubernetes.io/projected/967e7a44-ac71-42aa-9847-37799ff35cc0-kube-api-access-g2c4w\") pod \"network-metrics-daemon-jlgk5\" (UID: \"967e7a44-ac71-42aa-9847-37799ff35cc0\") " pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:21:52 crc kubenswrapper[4703]: E0309 13:21:52.782124 4703 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:21:52 crc kubenswrapper[4703]: E0309 13:21:52.782202 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs podName:967e7a44-ac71-42aa-9847-37799ff35cc0 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:53.282185075 +0000 UTC m=+109.249600761 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs") pod "network-metrics-daemon-jlgk5" (UID: "967e7a44-ac71-42aa-9847-37799ff35cc0") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.783702 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15f32ff09d3105428df57dfffe9686059fd157c5103baba5f42f041855ae49f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91038af82b7824d780df56614bd8d1d000dabd1f1ef6d32835ef0d81855f47d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:21:49Z\\\",\\\"message\\\":\\\" 6439 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:21:49.107569 6439 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0309 13:21:49.107615 6439 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0309 13:21:49.107640 6439 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI0309 13:21:49.107648 6439 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 13:21:49.107684 6439 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:21:49.107692 6439 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:21:49.107753 6439 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 13:21:49.107784 6439 factory.go:656] Stopping watch factory\\\\nI0309 13:21:49.107805 6439 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:21:49.107880 6439 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 13:21:49.107902 6439 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0309 13:21:49.107914 6439 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:21:49.107926 6439 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 13:21:49.107938 6439 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:21:49.107950 6439 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 13:21:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15f32ff09d3105428df57dfffe9686059fd157c5103baba5f42f041855ae49f4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"message\\\":\\\"rvices.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.38\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:21:50.593183 6644 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.40:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2a3fb1a3-a476-4e14-bcf5-fb79af60206a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0309 13:21:50.592946 6644 services_controller.go:443] 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.793107 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jlgk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"967e7a44-ac71-42aa-9847-37799ff35cc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jlgk5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.797940 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2c4w\" (UniqueName: \"kubernetes.io/projected/967e7a44-ac71-42aa-9847-37799ff35cc0-kube-api-access-g2c4w\") pod \"network-metrics-daemon-jlgk5\" (UID: \"967e7a44-ac71-42aa-9847-37799ff35cc0\") " pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.810382 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.822172 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:52Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.823915 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.823957 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.823965 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.824008 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:52 crc kubenswrapper[4703]: 
I0309 13:21:52.824023 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:52Z","lastTransitionTime":"2026-03-09T13:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.926661 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.926701 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.926713 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.926728 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:52 crc kubenswrapper[4703]: I0309 13:21:52.926738 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:52Z","lastTransitionTime":"2026-03-09T13:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.028794 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.028876 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.028900 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.028929 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.028950 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:53Z","lastTransitionTime":"2026-03-09T13:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.131067 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.131134 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.131157 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.131185 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.131214 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:53Z","lastTransitionTime":"2026-03-09T13:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.234104 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.234136 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.234145 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.234159 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.234168 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:53Z","lastTransitionTime":"2026-03-09T13:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.287289 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs\") pod \"network-metrics-daemon-jlgk5\" (UID: \"967e7a44-ac71-42aa-9847-37799ff35cc0\") " pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:21:53 crc kubenswrapper[4703]: E0309 13:21:53.287534 4703 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:21:53 crc kubenswrapper[4703]: E0309 13:21:53.287645 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs podName:967e7a44-ac71-42aa-9847-37799ff35cc0 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:54.287620614 +0000 UTC m=+110.255036310 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs") pod "network-metrics-daemon-jlgk5" (UID: "967e7a44-ac71-42aa-9847-37799ff35cc0") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.338251 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.338321 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.338342 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.338369 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.338389 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:53Z","lastTransitionTime":"2026-03-09T13:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.440803 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.440884 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.440899 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.440917 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.440929 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:53Z","lastTransitionTime":"2026-03-09T13:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.543426 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.543461 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.543469 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.543482 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.543494 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:53Z","lastTransitionTime":"2026-03-09T13:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.646057 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.646105 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.646119 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.646134 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.646145 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:53Z","lastTransitionTime":"2026-03-09T13:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.706630 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.706701 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.706637 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:21:53 crc kubenswrapper[4703]: E0309 13:21:53.706800 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:21:53 crc kubenswrapper[4703]: E0309 13:21:53.706972 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:21:53 crc kubenswrapper[4703]: E0309 13:21:53.707093 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.749028 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.749068 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.749077 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.749092 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.749102 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:53Z","lastTransitionTime":"2026-03-09T13:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.852749 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.852816 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.852833 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.852879 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.852897 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:53Z","lastTransitionTime":"2026-03-09T13:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.956740 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.956811 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.956833 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.956894 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:53 crc kubenswrapper[4703]: I0309 13:21:53.956920 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:53Z","lastTransitionTime":"2026-03-09T13:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.060530 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.060680 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.060733 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.060768 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.060791 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:54Z","lastTransitionTime":"2026-03-09T13:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.164081 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.164158 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.164181 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.164211 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.164233 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:54Z","lastTransitionTime":"2026-03-09T13:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.266880 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.266932 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.266945 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.266963 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.266976 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:54Z","lastTransitionTime":"2026-03-09T13:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.298463 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs\") pod \"network-metrics-daemon-jlgk5\" (UID: \"967e7a44-ac71-42aa-9847-37799ff35cc0\") " pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:21:54 crc kubenswrapper[4703]: E0309 13:21:54.298638 4703 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:21:54 crc kubenswrapper[4703]: E0309 13:21:54.298744 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs podName:967e7a44-ac71-42aa-9847-37799ff35cc0 nodeName:}" failed. No retries permitted until 2026-03-09 13:21:56.298721739 +0000 UTC m=+112.266137535 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs") pod "network-metrics-daemon-jlgk5" (UID: "967e7a44-ac71-42aa-9847-37799ff35cc0") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.370261 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.370308 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.370320 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.370336 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.370349 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:54Z","lastTransitionTime":"2026-03-09T13:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.472706 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.472765 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.472786 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.472815 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.472837 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:54Z","lastTransitionTime":"2026-03-09T13:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.575191 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.575267 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.575297 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.575340 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.575364 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:54Z","lastTransitionTime":"2026-03-09T13:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.677930 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.678006 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.678030 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.678066 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.678091 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:54Z","lastTransitionTime":"2026-03-09T13:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.706929 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:54 crc kubenswrapper[4703]: E0309 13:21:54.707101 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.724012 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os
-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.737235 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b6
7b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.748608 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.762261 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.775654 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.780215 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.780249 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.780261 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.780280 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.780292 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:54Z","lastTransitionTime":"2026-03-09T13:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.790331 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.804045 4703 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.817630 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e6a24cef765adee16f2208a4524331dfabdb46b2c9581f4fbe5e4c21e3fd394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac00693160cbdda2df9551e4cf1a43dfd092
f407a573bc1a9ec25ef4b383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2nrch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.837664 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.848914 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.871486 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15f32ff09d3105428df57dfffe9686059fd157c5103baba5f42f041855ae49f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91038af82b7824d780df56614bd8d1d000dabd1f1ef6d32835ef0d81855f47d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:21:49Z\\\",\\\"message\\\":\\\" 6439 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:21:49.107569 6439 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0309 13:21:49.107615 6439 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0309 13:21:49.107640 6439 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI0309 13:21:49.107648 6439 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 13:21:49.107684 6439 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:21:49.107692 6439 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:21:49.107753 6439 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 13:21:49.107784 6439 factory.go:656] Stopping watch factory\\\\nI0309 13:21:49.107805 6439 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:21:49.107880 6439 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 13:21:49.107902 6439 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0309 13:21:49.107914 6439 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:21:49.107926 6439 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 13:21:49.107938 6439 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:21:49.107950 6439 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 13:21:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15f32ff09d3105428df57dfffe9686059fd157c5103baba5f42f041855ae49f4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"message\\\":\\\"rvices.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.38\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:21:50.593183 6644 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.40:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2a3fb1a3-a476-4e14-bcf5-fb79af60206a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0309 13:21:50.592946 6644 services_controller.go:443] 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.882361 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.882401 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.882413 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.882435 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.882451 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:54Z","lastTransitionTime":"2026-03-09T13:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.883224 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jlgk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"967e7a44-ac71-42aa-9847-37799ff35cc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jlgk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:54 crc 
kubenswrapper[4703]: I0309 13:21:54.905801 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.925803 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.937596 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2
42b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:54Z is after 
2025-08-24T17:21:41Z" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.950535 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.960699 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://065c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:54Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.984569 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.984620 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.984635 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.984654 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:54 crc kubenswrapper[4703]: I0309 13:21:54.984666 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:54Z","lastTransitionTime":"2026-03-09T13:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.087814 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.087897 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.087922 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.087949 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.087971 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:55Z","lastTransitionTime":"2026-03-09T13:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.189949 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.189999 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.190010 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.190028 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.190040 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:55Z","lastTransitionTime":"2026-03-09T13:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.292171 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.292231 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.292248 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.292269 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.292287 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:55Z","lastTransitionTime":"2026-03-09T13:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.394170 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.394207 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.394215 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.394227 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.394235 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:55Z","lastTransitionTime":"2026-03-09T13:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.497884 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.497959 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.497977 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.498003 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.498019 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:55Z","lastTransitionTime":"2026-03-09T13:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.600797 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.600873 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.600883 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.600897 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.600907 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:55Z","lastTransitionTime":"2026-03-09T13:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.703235 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.703301 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.703324 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.703352 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.703373 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:55Z","lastTransitionTime":"2026-03-09T13:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.706661 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.706667 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:21:55 crc kubenswrapper[4703]: E0309 13:21:55.706998 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:21:55 crc kubenswrapper[4703]: E0309 13:21:55.707078 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.706715 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:21:55 crc kubenswrapper[4703]: E0309 13:21:55.707294 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.806904 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.807202 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.807337 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.807471 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.807590 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:55Z","lastTransitionTime":"2026-03-09T13:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.911164 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.911583 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.911893 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.912018 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:55 crc kubenswrapper[4703]: I0309 13:21:55.912113 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:55Z","lastTransitionTime":"2026-03-09T13:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.015183 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.015226 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.015238 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.015253 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.015263 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:56Z","lastTransitionTime":"2026-03-09T13:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.118732 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.118767 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.118780 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.118801 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.118812 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:56Z","lastTransitionTime":"2026-03-09T13:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.221407 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.221477 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.221525 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.221553 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.221575 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:56Z","lastTransitionTime":"2026-03-09T13:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.318918 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs\") pod \"network-metrics-daemon-jlgk5\" (UID: \"967e7a44-ac71-42aa-9847-37799ff35cc0\") " pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:21:56 crc kubenswrapper[4703]: E0309 13:21:56.319187 4703 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:21:56 crc kubenswrapper[4703]: E0309 13:21:56.319302 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs podName:967e7a44-ac71-42aa-9847-37799ff35cc0 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:00.31927403 +0000 UTC m=+116.286689777 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs") pod "network-metrics-daemon-jlgk5" (UID: "967e7a44-ac71-42aa-9847-37799ff35cc0") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.324723 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.325014 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.325044 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.325075 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.325099 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:56Z","lastTransitionTime":"2026-03-09T13:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.438490 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.438530 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.438542 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.438557 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.438570 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:56Z","lastTransitionTime":"2026-03-09T13:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.542125 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.542203 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.542277 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.542300 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.542355 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:56Z","lastTransitionTime":"2026-03-09T13:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.645971 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.646035 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.646058 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.646089 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.646111 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:56Z","lastTransitionTime":"2026-03-09T13:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.707038 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 13:21:56 crc kubenswrapper[4703]: E0309 13:21:56.707211 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.708169 4703 scope.go:117] "RemoveContainer" containerID="88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565"
Mar 09 13:21:56 crc kubenswrapper[4703]: E0309 13:21:56.708647 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.748657 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.748714 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.748730 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.748755 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.748772 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:56Z","lastTransitionTime":"2026-03-09T13:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.852537 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.852611 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.852628 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.852652 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.852670 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:56Z","lastTransitionTime":"2026-03-09T13:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.955453 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.955509 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.955526 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.955549 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.955566 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:56Z","lastTransitionTime":"2026-03-09T13:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.999707 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.999766 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:56 crc kubenswrapper[4703]: I0309 13:21:56.999788 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:56.999814 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:56.999836 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:56Z","lastTransitionTime":"2026-03-09T13:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:57 crc kubenswrapper[4703]: E0309 13:21:57.021539 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:57Z is after 2025-08-24T17:21:41Z"
Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.026687 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.026754 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.026778 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.026807 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.026824 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:57Z","lastTransitionTime":"2026-03-09T13:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:57 crc kubenswrapper[4703]: E0309 13:21:57.047139 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:57Z is after 2025-08-24T17:21:41Z"
Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.052586 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.052639 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.052660 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.052686 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.052723 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:57Z","lastTransitionTime":"2026-03-09T13:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.086133 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.086204 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.086226 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.086257 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.086280 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:57Z","lastTransitionTime":"2026-03-09T13:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.113220 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.113283 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.113302 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.113331 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.113354 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:57Z","lastTransitionTime":"2026-03-09T13:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:57 crc kubenswrapper[4703]: E0309 13:21:57.133101 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:21:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:21:57Z is after 2025-08-24T17:21:41Z" Mar 09 13:21:57 crc kubenswrapper[4703]: E0309 13:21:57.133335 4703 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.135502 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.135562 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.135580 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.135602 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.135618 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:57Z","lastTransitionTime":"2026-03-09T13:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.239435 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.239519 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.239545 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.239576 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.239595 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:57Z","lastTransitionTime":"2026-03-09T13:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.342666 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.342722 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.342740 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.342764 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.342785 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:57Z","lastTransitionTime":"2026-03-09T13:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.445826 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.445924 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.445941 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.445964 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.445981 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:57Z","lastTransitionTime":"2026-03-09T13:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.549006 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.549052 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.549069 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.549092 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.549109 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:57Z","lastTransitionTime":"2026-03-09T13:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.651516 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.651561 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.651578 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.651602 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.651621 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:57Z","lastTransitionTime":"2026-03-09T13:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.706652 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.706676 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:21:57 crc kubenswrapper[4703]: E0309 13:21:57.706802 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.706827 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:21:57 crc kubenswrapper[4703]: E0309 13:21:57.707003 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:21:57 crc kubenswrapper[4703]: E0309 13:21:57.707094 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.754695 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.754756 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.754775 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.754800 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.754823 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:57Z","lastTransitionTime":"2026-03-09T13:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.857436 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.857500 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.857517 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.857544 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.857562 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:57Z","lastTransitionTime":"2026-03-09T13:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.960454 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.960533 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.960560 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.960590 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:57 crc kubenswrapper[4703]: I0309 13:21:57.960614 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:57Z","lastTransitionTime":"2026-03-09T13:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.064306 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.064384 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.064403 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.064428 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.064470 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:58Z","lastTransitionTime":"2026-03-09T13:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.167553 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.167627 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.167650 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.167677 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.167758 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:58Z","lastTransitionTime":"2026-03-09T13:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.270789 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.270834 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.270889 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.270934 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.270950 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:58Z","lastTransitionTime":"2026-03-09T13:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.374211 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.374276 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.374294 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.374317 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.374335 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:58Z","lastTransitionTime":"2026-03-09T13:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.478089 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.478147 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.478164 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.478190 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.478207 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:58Z","lastTransitionTime":"2026-03-09T13:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.581250 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.581299 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.581315 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.581335 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.581350 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:58Z","lastTransitionTime":"2026-03-09T13:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.684473 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.684548 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.684573 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.684664 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.684689 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:58Z","lastTransitionTime":"2026-03-09T13:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.706673 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:21:58 crc kubenswrapper[4703]: E0309 13:21:58.706956 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.787604 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.787679 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.787701 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.787730 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.787753 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:58Z","lastTransitionTime":"2026-03-09T13:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.890557 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.890640 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.890660 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.890687 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.890738 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:58Z","lastTransitionTime":"2026-03-09T13:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.994428 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.994492 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.994517 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.994548 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:58 crc kubenswrapper[4703]: I0309 13:21:58.994571 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:58Z","lastTransitionTime":"2026-03-09T13:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.097277 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.097337 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.097354 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.097378 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.097397 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:59Z","lastTransitionTime":"2026-03-09T13:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.200215 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.200277 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.200294 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.200316 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.200334 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:59Z","lastTransitionTime":"2026-03-09T13:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.308284 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.308382 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.308404 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.308434 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.308459 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:59Z","lastTransitionTime":"2026-03-09T13:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.411926 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.412002 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.412026 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.412058 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.412079 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:59Z","lastTransitionTime":"2026-03-09T13:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.514827 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.514901 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.514918 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.514942 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.514961 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:59Z","lastTransitionTime":"2026-03-09T13:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.617629 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.617671 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.617685 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.617701 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.617713 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:59Z","lastTransitionTime":"2026-03-09T13:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.706119 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.706192 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.706246 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:21:59 crc kubenswrapper[4703]: E0309 13:21:59.706561 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:21:59 crc kubenswrapper[4703]: E0309 13:21:59.706693 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:21:59 crc kubenswrapper[4703]: E0309 13:21:59.706921 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.719926 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.720008 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.720026 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.720079 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.720096 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:59Z","lastTransitionTime":"2026-03-09T13:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.823789 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.823882 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.823906 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.823935 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.823959 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:59Z","lastTransitionTime":"2026-03-09T13:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.927586 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.927634 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.927651 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.927673 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:21:59 crc kubenswrapper[4703]: I0309 13:21:59.927691 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:21:59Z","lastTransitionTime":"2026-03-09T13:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.030167 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.030257 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.030280 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.030349 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.030367 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:00Z","lastTransitionTime":"2026-03-09T13:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.134493 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.134540 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.134557 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.134580 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.134596 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:00Z","lastTransitionTime":"2026-03-09T13:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.237406 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.237479 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.237491 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.237532 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.237544 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:00Z","lastTransitionTime":"2026-03-09T13:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.341060 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.341116 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.341133 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.341156 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.341173 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:00Z","lastTransitionTime":"2026-03-09T13:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.386152 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs\") pod \"network-metrics-daemon-jlgk5\" (UID: \"967e7a44-ac71-42aa-9847-37799ff35cc0\") " pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:00 crc kubenswrapper[4703]: E0309 13:22:00.386396 4703 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:00 crc kubenswrapper[4703]: E0309 13:22:00.386480 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs podName:967e7a44-ac71-42aa-9847-37799ff35cc0 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:08.386454838 +0000 UTC m=+124.353870574 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs") pod "network-metrics-daemon-jlgk5" (UID: "967e7a44-ac71-42aa-9847-37799ff35cc0") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.444287 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.444338 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.444355 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.444381 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.444400 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:00Z","lastTransitionTime":"2026-03-09T13:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.547355 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.547415 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.547438 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.547470 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.547491 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:00Z","lastTransitionTime":"2026-03-09T13:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.651183 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.651261 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.651278 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.651305 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.651324 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:00Z","lastTransitionTime":"2026-03-09T13:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.706432 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:00 crc kubenswrapper[4703]: E0309 13:22:00.706616 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.754477 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.754535 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.754555 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.754579 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.754597 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:00Z","lastTransitionTime":"2026-03-09T13:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.857379 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.857425 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.857442 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.857460 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.857472 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:00Z","lastTransitionTime":"2026-03-09T13:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.960874 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.960963 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.960988 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.961014 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:00 crc kubenswrapper[4703]: I0309 13:22:00.961032 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:00Z","lastTransitionTime":"2026-03-09T13:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.064371 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.064438 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.064451 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.064475 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.064493 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:01Z","lastTransitionTime":"2026-03-09T13:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.168532 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.168605 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.168630 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.168661 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.168683 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:01Z","lastTransitionTime":"2026-03-09T13:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.271725 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.271800 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.271816 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.271841 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.271943 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:01Z","lastTransitionTime":"2026-03-09T13:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.375652 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.375739 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.375758 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.375798 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.375820 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:01Z","lastTransitionTime":"2026-03-09T13:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.478638 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.478694 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.478712 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.478735 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.478755 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:01Z","lastTransitionTime":"2026-03-09T13:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.581260 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.581316 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.581333 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.581355 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.581375 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:01Z","lastTransitionTime":"2026-03-09T13:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.686004 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.686068 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.686086 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.686108 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.686129 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:01Z","lastTransitionTime":"2026-03-09T13:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.706368 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.706432 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.706378 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:01 crc kubenswrapper[4703]: E0309 13:22:01.706650 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:01 crc kubenswrapper[4703]: E0309 13:22:01.706743 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:01 crc kubenswrapper[4703]: E0309 13:22:01.707110 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.719528 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.789729 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.789783 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.789801 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.789823 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.789862 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:01Z","lastTransitionTime":"2026-03-09T13:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.893151 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.893212 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.893244 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.893268 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.893287 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:01Z","lastTransitionTime":"2026-03-09T13:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.997117 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.997176 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.997193 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.997217 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:01 crc kubenswrapper[4703]: I0309 13:22:01.997235 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:01Z","lastTransitionTime":"2026-03-09T13:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.100957 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.101114 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.101141 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.101171 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.101190 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:02Z","lastTransitionTime":"2026-03-09T13:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.204508 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.204571 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.204587 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.204612 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.204630 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:02Z","lastTransitionTime":"2026-03-09T13:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.308255 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.308329 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.308347 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.308372 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.308389 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:02Z","lastTransitionTime":"2026-03-09T13:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.410981 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.411046 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.411059 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.411078 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.411093 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:02Z","lastTransitionTime":"2026-03-09T13:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.514131 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.514185 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.514197 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.514220 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.514234 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:02Z","lastTransitionTime":"2026-03-09T13:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.616278 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.616530 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.616603 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.616672 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.616801 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:02Z","lastTransitionTime":"2026-03-09T13:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.706352 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:02 crc kubenswrapper[4703]: E0309 13:22:02.706556 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.719363 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.719423 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.719441 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.719467 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.719490 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:02Z","lastTransitionTime":"2026-03-09T13:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.822712 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.822786 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.822808 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.822873 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.822897 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:02Z","lastTransitionTime":"2026-03-09T13:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.925564 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.925818 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.925910 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.925998 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:02 crc kubenswrapper[4703]: I0309 13:22:02.926073 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:02Z","lastTransitionTime":"2026-03-09T13:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.029700 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.029794 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.029812 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.029835 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.029887 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:03Z","lastTransitionTime":"2026-03-09T13:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.132694 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.132769 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.132787 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.132812 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.132829 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:03Z","lastTransitionTime":"2026-03-09T13:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.235670 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.235784 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.235804 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.235830 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.235902 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:03Z","lastTransitionTime":"2026-03-09T13:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.339052 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.339177 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.339198 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.339264 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.339284 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:03Z","lastTransitionTime":"2026-03-09T13:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.442074 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.442175 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.442202 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.442229 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.442250 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:03Z","lastTransitionTime":"2026-03-09T13:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.544894 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.544949 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.544965 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.544990 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.545009 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:03Z","lastTransitionTime":"2026-03-09T13:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.647816 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.647922 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.647950 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.647979 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.648000 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:03Z","lastTransitionTime":"2026-03-09T13:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.705885 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.705935 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.705884 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:03 crc kubenswrapper[4703]: E0309 13:22:03.706111 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:03 crc kubenswrapper[4703]: E0309 13:22:03.706193 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:03 crc kubenswrapper[4703]: E0309 13:22:03.706632 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.713026 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.750784 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.750891 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.750921 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.750953 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.750973 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:03Z","lastTransitionTime":"2026-03-09T13:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.854501 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.854602 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.854617 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.854636 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.854671 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:03Z","lastTransitionTime":"2026-03-09T13:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.957801 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.957890 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.957908 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.957932 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:03 crc kubenswrapper[4703]: I0309 13:22:03.957949 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:03Z","lastTransitionTime":"2026-03-09T13:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.061489 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.061566 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.061587 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.061617 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.061639 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:04Z","lastTransitionTime":"2026-03-09T13:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.164524 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.164599 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.164620 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.164648 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.164669 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:04Z","lastTransitionTime":"2026-03-09T13:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.268456 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.268533 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.268557 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.268593 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.268615 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:04Z","lastTransitionTime":"2026-03-09T13:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.371335 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.371395 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.371413 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.371436 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.371452 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:04Z","lastTransitionTime":"2026-03-09T13:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.474810 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.474897 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.474915 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.474934 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.474945 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:04Z","lastTransitionTime":"2026-03-09T13:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.577668 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.577746 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.577773 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.577802 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.577825 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:04Z","lastTransitionTime":"2026-03-09T13:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:04 crc kubenswrapper[4703]: E0309 13:22:04.678774 4703 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.706403 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:04 crc kubenswrapper[4703]: E0309 13:22:04.706604 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.726061 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:04Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.740674 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:04Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.763497 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15f32ff09d3105428df57dfffe9686059fd157c5103baba5f42f041855ae49f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91038af82b7824d780df56614bd8d1d000dabd1f1ef6d32835ef0d81855f47d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:21:49Z\\\",\\\"message\\\":\\\" 6439 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:21:49.107569 6439 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0309 13:21:49.107615 6439 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0309 13:21:49.107640 6439 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI0309 13:21:49.107648 6439 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 13:21:49.107684 6439 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:21:49.107692 6439 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:21:49.107753 6439 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 13:21:49.107784 6439 factory.go:656] Stopping watch factory\\\\nI0309 13:21:49.107805 6439 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:21:49.107880 6439 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 13:21:49.107902 6439 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0309 13:21:49.107914 6439 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:21:49.107926 6439 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 13:21:49.107938 6439 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:21:49.107950 6439 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 13:21:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15f32ff09d3105428df57dfffe9686059fd157c5103baba5f42f041855ae49f4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"message\\\":\\\"rvices.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.38\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:21:50.593183 6644 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.40:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2a3fb1a3-a476-4e14-bcf5-fb79af60206a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0309 13:21:50.592946 6644 services_controller.go:443] 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:04Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.777073 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jlgk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"967e7a44-ac71-42aa-9847-37799ff35cc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jlgk5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:04Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.791405 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740f23ba-2990-4e4a-8ea9-f6c8e8e68c22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5b7237bc4b4d4036a6f66dc705d398087158e42536fa4c5f5d1aa7c9c02d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a105bcef254c34ab707fc8dbc94df273ebda59e4d376c4598b2c7ba47458e6c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:20:37Z\\\",\\\"message\\\":\\\"+ timeout 
3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:20:06.706547 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 13:20:06.709297 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:20:06.744041 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:20:06.749709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:20:37.146329 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:20:37.146449 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e3485dc1ed7496fc861abc4578735829d38eb0dedae3210718362c42783bb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1230ee4792811f205c69f3e4db77cf395897c92e08634a9cd77c5bb17c6e88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1ee03239da8fa7cf5565ff3c6bba6a9d71db4cbcd64f08db0f0be0d8b6134a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:04Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.810479 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25eeebd3-86e9-4166-a909-2ad652d90eed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://518722d55747b0df3c6bdb05215d5c8439cef0e2d505a5effb510465273eb0dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://520d7468346bb67f0dc9bfd9c50a059a61cb96f7e43fa87dc52885b3c648a414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5de22dff9567e8ca8196c8e42d84edee93c6034417605502b8af248bf17d1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:04Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:04 crc kubenswrapper[4703]: E0309 13:22:04.812165 4703 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.841957 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:04Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.861972 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:04Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.878250 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2
42b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:04Z is after 
2025-08-24T17:21:41Z" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.894196 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:04Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.907369 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://065c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:04Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.923784 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b3
76f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d
09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:04Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.940456 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:04Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.953377 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T13:22:04Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.964913 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:04Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.986281 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:04Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:04 crc kubenswrapper[4703]: I0309 13:22:04.999295 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:04Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.010291 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.025155 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e6a24cef765adee16f2208a4524331dfabdb46b2c9581f4fbe5e4c21e3fd394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac00693160cbdda2df9551e4cf1a43dfd092
f407a573bc1a9ec25ef4b383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2nrch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.706156 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.706258 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:05 crc kubenswrapper[4703]: E0309 13:22:05.706423 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.706482 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:05 crc kubenswrapper[4703]: E0309 13:22:05.706508 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:05 crc kubenswrapper[4703]: E0309 13:22:05.707175 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.707681 4703 scope.go:117] "RemoveContainer" containerID="15f32ff09d3105428df57dfffe9686059fd157c5103baba5f42f041855ae49f4" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.730164 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedA
t\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.754008 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.771784 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\
":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.790390 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.809087 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.826351 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bf
cb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.844136 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.858558 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e6a24cef765adee16f2208a4524331dfabdb46b2c9581f4fbe5e4c21e3fd394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac00693160cbdda2df9551e4cf1a43dfd092
f407a573bc1a9ec25ef4b383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2nrch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.872544 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.896900 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15f32ff09d3105428df57dfffe9686059fd157c5103baba5f42f041855ae49f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15f32ff09d3105428df57dfffe9686059fd157c5103baba5f42f041855ae49f4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"message\\\":\\\"rvices.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver-operator/metrics\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.38\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:21:50.593183 6644 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.40:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2a3fb1a3-a476-4e14-bcf5-fb79af60206a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0309 13:21:50.592946 6644 services_controller.go:443] \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9khwq_openshift-ovn-kubernetes(650f98b2-73a9-4c73-b0cf-70d3bdd61edd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450
b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.907964 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jlgk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"967e7a44-ac71-42aa-9847-37799ff35cc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jlgk5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.919243 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740f23ba-2990-4e4a-8ea9-f6c8e8e68c22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5b7237bc4b4d4036a6f66dc705d398087158e42536fa4c5f5d1aa7c9c02d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a105bcef254c34ab707fc8dbc94df273ebda59e4d376c4598b2c7ba47458e6c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:20:37Z\\
\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:20:06.706547 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 13:20:06.709297 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:20:06.744041 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:20:06.749709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:20:37.146329 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:20:37.146449 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e3485dc1ed7496fc861abc4578735829d38eb0dedae3210718362c42783bb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1230ee4792811f205c69f3e4db77cf395897c92e08634a9cd77c5bb17c6e88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1ee03239da8fa7cf5565ff3c6bba6a9d71db4cbcd64f08db0f0be0d8b6134a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.932856 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25eeebd3-86e9-4166-a909-2ad652d90eed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://518722d55747b0df3c6bdb05215d5c8439cef0e2d505a5effb510465273eb0dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://520d7468346bb67f0dc9bfd9c50a059a61cb96f7e43fa87dc52885b3c648a414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5de22dff9567e8ca8196c8e42d84edee93c6034417605502b8af248bf17d1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.955874 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.972860 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:05 crc kubenswrapper[4703]: I0309 13:22:05.988910 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.001930 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:05Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.018728 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:06Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.030806 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://065c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:06Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.155423 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:06 crc kubenswrapper[4703]: E0309 13:22:06.155563 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:22:38.155543983 +0000 UTC m=+154.122959669 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.256876 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.256918 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.256937 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.256968 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:06 crc kubenswrapper[4703]: E0309 13:22:06.257064 4703 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:06 crc kubenswrapper[4703]: E0309 13:22:06.257111 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:38.257096789 +0000 UTC m=+154.224512475 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:06 crc kubenswrapper[4703]: E0309 13:22:06.257156 4703 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:06 crc kubenswrapper[4703]: E0309 13:22:06.257199 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:06 crc kubenswrapper[4703]: E0309 13:22:06.257240 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:06 
crc kubenswrapper[4703]: E0309 13:22:06.257254 4703 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:06 crc kubenswrapper[4703]: E0309 13:22:06.257274 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:38.257246284 +0000 UTC m=+154.224662010 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:06 crc kubenswrapper[4703]: E0309 13:22:06.257308 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:38.257288925 +0000 UTC m=+154.224704661 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:06 crc kubenswrapper[4703]: E0309 13:22:06.257429 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:06 crc kubenswrapper[4703]: E0309 13:22:06.257454 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:06 crc kubenswrapper[4703]: E0309 13:22:06.257475 4703 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:06 crc kubenswrapper[4703]: E0309 13:22:06.257522 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:38.257506791 +0000 UTC m=+154.224922517 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.706570 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:06 crc kubenswrapper[4703]: E0309 13:22:06.706799 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.719361 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9khwq_650f98b2-73a9-4c73-b0cf-70d3bdd61edd/ovnkube-controller/2.log" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.720441 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9khwq_650f98b2-73a9-4c73-b0cf-70d3bdd61edd/ovnkube-controller/1.log" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.724697 4703 generic.go:334] "Generic (PLEG): container finished" podID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerID="b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f" exitCode=1 Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.724751 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" 
event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerDied","Data":"b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f"} Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.724830 4703 scope.go:117] "RemoveContainer" containerID="15f32ff09d3105428df57dfffe9686059fd157c5103baba5f42f041855ae49f4" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.726027 4703 scope.go:117] "RemoveContainer" containerID="b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f" Mar 09 13:22:06 crc kubenswrapper[4703]: E0309 13:22:06.726394 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-9khwq_openshift-ovn-kubernetes(650f98b2-73a9-4c73-b0cf-70d3bdd61edd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.748805 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:22:06Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.768267 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:06Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.789816 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:06Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.808030 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:06Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.831777 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bf
cb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:06Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.852893 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:06Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.872266 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:06Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.888966 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e6a24cef765adee16f2208a4524331dfabdb46b2c9581f4fbe5e4c21e3fd394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac00693160cbdda2df9551e4cf1a43dfd092
f407a573bc1a9ec25ef4b383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2nrch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:06Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.918525 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15f32ff09d3105428df57dfffe9686059fd157c5103baba5f42f041855ae49f4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"message\\\":\\\"rvices.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver-operator/metrics\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.38\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0309 13:21:50.593183 6644 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-service-ca-operator/metrics]} name:Service_openshift-service-ca-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.40:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2a3fb1a3-a476-4e14-bcf5-fb79af60206a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0309 13:21:50.592946 6644 services_controller.go:443] \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:06Z\\\",\\\"message\\\":\\\"on-jlgk5]\\\\nI0309 13:22:06.598741 6882 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0309 13:22:06.598732 6882 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 13:22:06.598755 6882 obj_retry.go:285] Attempting retry of *v1.Pod 
openshift-multus/network-metrics-daemon-jlgk5 before timer (time: 2026-03-09 13:22:07.940244381 +0000 UTC m=+1.903284629): skip\\\\nI0309 13:22:06.598765 6882 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 36.541µs)\\\\nI0309 13:22:06.598780 6882 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:22:06.598799 6882 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:22:06.598830 6882 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 13:22:06.598866 6882 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:22:06.598955 6882 factory.go:656] Stopping watch factory\\\\nI0309 13:22:06.598977 6882 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:06.598981 6882 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:22:06.599054 6882 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 13:22:06.599159 6882 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a
8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:06Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.937990 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jlgk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"967e7a44-ac71-42aa-9847-37799ff35cc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jlgk5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:06Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.960634 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740f23ba-2990-4e4a-8ea9-f6c8e8e68c22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5b7237bc4b4d4036a6f66dc705d398087158e42536fa4c5f5d1aa7c9c02d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a105bcef254c34ab707fc8dbc94df273ebda59e4d376c4598b2c7ba47458e6c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:20:37Z\\\",\\\"message\\\":\\\"+ timeout 
3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:20:06.706547 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 13:20:06.709297 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:20:06.744041 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:20:06.749709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:20:37.146329 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:20:37.146449 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e3485dc1ed7496fc861abc4578735829d38eb0dedae3210718362c42783bb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1230ee4792811f205c69f3e4db77cf395897c92e08634a9cd77c5bb17c6e88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1ee03239da8fa7cf5565ff3c6bba6a9d71db4cbcd64f08db0f0be0d8b6134a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:06Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:06 crc kubenswrapper[4703]: I0309 13:22:06.979624 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25eeebd3-86e9-4166-a909-2ad652d90eed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://518722d55747b0df3c6bdb05215d5c8439cef0e2d505a5effb510465273eb0dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://520d7468346bb67f0dc9bfd9c50a059a61cb96f7e43fa87dc52885b3c648a414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5de22dff9567e8ca8196c8e42d84edee93c6034417605502b8af248bf17d1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:06Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.010312 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.034096 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.053405 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.068227 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.084536 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac
-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.104150 4703 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.133770 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://065c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.287363 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.287405 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.287417 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.287435 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.287449 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:07Z","lastTransitionTime":"2026-03-09T13:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:07 crc kubenswrapper[4703]: E0309 13:22:07.305709 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.310601 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.310698 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.310711 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.310729 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.310742 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:07Z","lastTransitionTime":"2026-03-09T13:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:07 crc kubenswrapper[4703]: E0309 13:22:07.328990 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.333565 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.333604 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.333642 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.333661 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.333672 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:07Z","lastTransitionTime":"2026-03-09T13:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:07 crc kubenswrapper[4703]: E0309 13:22:07.353494 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.359072 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.359149 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.359171 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.359199 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.359216 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:07Z","lastTransitionTime":"2026-03-09T13:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:07 crc kubenswrapper[4703]: E0309 13:22:07.378378 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.382482 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.382540 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.382555 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.382572 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.382584 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:07Z","lastTransitionTime":"2026-03-09T13:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:07 crc kubenswrapper[4703]: E0309 13:22:07.396292 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:07Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:07 crc kubenswrapper[4703]: E0309 13:22:07.396468 4703 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.706012 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.706012 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:07 crc kubenswrapper[4703]: E0309 13:22:07.706276 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.706045 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:07 crc kubenswrapper[4703]: E0309 13:22:07.706399 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:07 crc kubenswrapper[4703]: E0309 13:22:07.706588 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:07 crc kubenswrapper[4703]: I0309 13:22:07.731088 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9khwq_650f98b2-73a9-4c73-b0cf-70d3bdd61edd/ovnkube-controller/2.log" Mar 09 13:22:08 crc kubenswrapper[4703]: I0309 13:22:08.480197 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs\") pod \"network-metrics-daemon-jlgk5\" (UID: \"967e7a44-ac71-42aa-9847-37799ff35cc0\") " pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:08 crc kubenswrapper[4703]: E0309 13:22:08.480457 4703 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:08 crc kubenswrapper[4703]: E0309 13:22:08.480578 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs podName:967e7a44-ac71-42aa-9847-37799ff35cc0 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:24.480546921 +0000 UTC m=+140.447962637 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs") pod "network-metrics-daemon-jlgk5" (UID: "967e7a44-ac71-42aa-9847-37799ff35cc0") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:08 crc kubenswrapper[4703]: I0309 13:22:08.705968 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:08 crc kubenswrapper[4703]: E0309 13:22:08.706169 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.527108 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.528301 4703 scope.go:117] "RemoveContainer" containerID="b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f" Mar 09 13:22:09 crc kubenswrapper[4703]: E0309 13:22:09.528609 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-9khwq_openshift-ovn-kubernetes(650f98b2-73a9-4c73-b0cf-70d3bdd61edd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.547988 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:09Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.569303 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://065c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:09Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.585425 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:09Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:09 crc 
kubenswrapper[4703]: I0309 13:22:09.612251 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:09Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.627583 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:09Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.643324 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:09Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.662604 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bf
cb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:09Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.677064 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:09Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.690355 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:09Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.706158 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.706235 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e6a24cef765adee16f2208a4524331dfabdb46b2c9581f4fbe5e4c21e3fd394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac00693160cbdda2df9551e4cf1a43dfd092
f407a573bc1a9ec25ef4b383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2nrch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:09Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.706307 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:09 crc kubenswrapper[4703]: E0309 13:22:09.706380 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.706454 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:09 crc kubenswrapper[4703]: E0309 13:22:09.706518 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:09 crc kubenswrapper[4703]: E0309 13:22:09.706636 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.721433 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:09Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.733289 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25eeebd3-86e9-4166-a909-2ad652d90eed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://518722d55747b0df3c6bdb05215d5c8439cef0e2d505a5effb510465273eb0dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://520d7468346bb67f0dc9bfd9c50a059a61cb96f7e43fa87dc52885b3c648a414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5de22dff9567e8ca8196c8e42d84edee93c6034417605502b8af248bf17d1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:09Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.752874 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:09Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.766756 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:09Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.779112 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:09Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.791885 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:09Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.812572 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:06Z\\\",\\\"message\\\":\\\"on-jlgk5]\\\\nI0309 13:22:06.598741 6882 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0309 13:22:06.598732 6882 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 13:22:06.598755 6882 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-jlgk5 
before timer (time: 2026-03-09 13:22:07.940244381 +0000 UTC m=+1.903284629): skip\\\\nI0309 13:22:06.598765 6882 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 36.541µs)\\\\nI0309 13:22:06.598780 6882 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:22:06.598799 6882 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:22:06.598830 6882 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 13:22:06.598866 6882 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:22:06.598955 6882 factory.go:656] Stopping watch factory\\\\nI0309 13:22:06.598977 6882 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:06.598981 6882 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:22:06.599054 6882 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 13:22:06.599159 6882 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9khwq_openshift-ovn-kubernetes(650f98b2-73a9-4c73-b0cf-70d3bdd61edd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450
b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:09Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:09 crc kubenswrapper[4703]: E0309 13:22:09.813836 4703 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:22:09 crc kubenswrapper[4703]: I0309 13:22:09.825303 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jlgk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"967e7a44-ac71-42aa-9847-37799ff35cc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jlgk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:09Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:09 crc 
kubenswrapper[4703]: I0309 13:22:09.840042 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740f23ba-2990-4e4a-8ea9-f6c8e8e68c22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5b7237bc4b4d4036a6f66dc705d398087158e42536fa4c5f5d1aa7c9c02d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a105bcef254c34ab707fc8dbc94df273ebda59e4d376c4598b2c7ba47458e6c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:20:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:20:06.706547 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 13:20:06.709297 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:20:06.744041 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:20:06.749709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:20:37.146329 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:20:37.146449 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e3485dc1ed7496fc861abc4578735829d38eb0dedae3210718362c42783bb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1230ee4792811f205c69f3e4db77cf395897c92e08634a9cd77c5bb17c6e88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1ee03239da8fa7cf5565ff3c6bba6a9d71db4cbcd64f08db0f0be0d8b6134a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:09Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:10 crc kubenswrapper[4703]: I0309 13:22:10.706705 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:10 crc kubenswrapper[4703]: E0309 13:22:10.706981 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:11 crc kubenswrapper[4703]: I0309 13:22:11.706209 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:11 crc kubenswrapper[4703]: I0309 13:22:11.706348 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:11 crc kubenswrapper[4703]: I0309 13:22:11.706600 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:11 crc kubenswrapper[4703]: E0309 13:22:11.706711 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:11 crc kubenswrapper[4703]: E0309 13:22:11.706974 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:11 crc kubenswrapper[4703]: I0309 13:22:11.707108 4703 scope.go:117] "RemoveContainer" containerID="88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565" Mar 09 13:22:11 crc kubenswrapper[4703]: E0309 13:22:11.707105 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:12 crc kubenswrapper[4703]: I0309 13:22:12.706923 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:12 crc kubenswrapper[4703]: E0309 13:22:12.707452 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:12 crc kubenswrapper[4703]: I0309 13:22:12.753758 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 13:22:12 crc kubenswrapper[4703]: I0309 13:22:12.755709 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99"} Mar 09 13:22:12 crc kubenswrapper[4703]: I0309 13:22:12.756267 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:22:12 crc kubenswrapper[4703]: I0309 13:22:12.768209 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:12Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:12 crc kubenswrapper[4703]: I0309 13:22:12.780583 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e6a24cef765adee16f2208a4524331dfabdb46b2c9581f4fbe5e4c21e3fd394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac00693160cbdda2df9551e4cf1a43dfd092
f407a573bc1a9ec25ef4b383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2nrch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:12Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:12 crc kubenswrapper[4703]: I0309 13:22:12.809986 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:06Z\\\",\\\"message\\\":\\\"on-jlgk5]\\\\nI0309 13:22:06.598741 6882 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0309 13:22:06.598732 6882 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 13:22:06.598755 6882 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-jlgk5 
before timer (time: 2026-03-09 13:22:07.940244381 +0000 UTC m=+1.903284629): skip\\\\nI0309 13:22:06.598765 6882 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 36.541µs)\\\\nI0309 13:22:06.598780 6882 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:22:06.598799 6882 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:22:06.598830 6882 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 13:22:06.598866 6882 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:22:06.598955 6882 factory.go:656] Stopping watch factory\\\\nI0309 13:22:06.598977 6882 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:06.598981 6882 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:22:06.599054 6882 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 13:22:06.599159 6882 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9khwq_openshift-ovn-kubernetes(650f98b2-73a9-4c73-b0cf-70d3bdd61edd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450
b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:12Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:12 crc kubenswrapper[4703]: I0309 13:22:12.826901 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jlgk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"967e7a44-ac71-42aa-9847-37799ff35cc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jlgk5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:12Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:12 crc kubenswrapper[4703]: I0309 13:22:12.845764 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740f23ba-2990-4e4a-8ea9-f6c8e8e68c22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5b7237bc4b4d4036a6f66dc705d398087158e42536fa4c5f5d1aa7c9c02d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a105bcef254c34ab707fc8dbc94df273ebda59e4d376c4598b2c7ba47458e6c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:20:37Z\\
\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:20:06.706547 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 13:20:06.709297 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:20:06.744041 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:20:06.749709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:20:37.146329 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:20:37.146449 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e3485dc1ed7496fc861abc4578735829d38eb0dedae3210718362c42783bb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1230ee4792811f205c69f3e4db77cf395897c92e08634a9cd77c5bb17c6e88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1ee03239da8fa7cf5565ff3c6bba6a9d71db4cbcd64f08db0f0be0d8b6134a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:12Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:12 crc kubenswrapper[4703]: I0309 13:22:12.859376 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25eeebd3-86e9-4166-a909-2ad652d90eed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://518722d55747b0df3c6bdb05215d5c8439cef0e2d505a5effb510465273eb0dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://520d7468346bb67f0dc9bfd9c50a059a61cb96f7e43fa87dc52885b3c648a414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5de22dff9567e8ca8196c8e42d84edee93c6034417605502b8af248bf17d1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:12Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:12 crc kubenswrapper[4703]: I0309 13:22:12.894081 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:12Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:12 crc kubenswrapper[4703]: I0309 13:22:12.911173 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:12Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:12 crc kubenswrapper[4703]: I0309 13:22:12.925161 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:12Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:12 crc kubenswrapper[4703]: I0309 13:22:12.938319 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:12Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:12 crc kubenswrapper[4703]: I0309 13:22:12.951672 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac
-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:12Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:12 crc kubenswrapper[4703]: I0309 13:22:12.966875 4703 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:12Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:12 crc kubenswrapper[4703]: I0309 13:22:12.981336 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://065c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:12Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:12 crc kubenswrapper[4703]: I0309 13:22:12.994799 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T1
3:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:12Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:13 crc kubenswrapper[4703]: I0309 13:22:13.010051 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:13 crc kubenswrapper[4703]: I0309 13:22:13.025083 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:13 crc kubenswrapper[4703]: I0309 13:22:13.037347 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:13 crc kubenswrapper[4703]: I0309 13:22:13.062755 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bf
cb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:13 crc kubenswrapper[4703]: I0309 13:22:13.084403 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:13Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:13 crc kubenswrapper[4703]: I0309 13:22:13.706617 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:13 crc kubenswrapper[4703]: I0309 13:22:13.706737 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:13 crc kubenswrapper[4703]: I0309 13:22:13.706627 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:13 crc kubenswrapper[4703]: E0309 13:22:13.706870 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:13 crc kubenswrapper[4703]: E0309 13:22:13.706963 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:13 crc kubenswrapper[4703]: E0309 13:22:13.707043 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:14 crc kubenswrapper[4703]: I0309 13:22:14.706446 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:14 crc kubenswrapper[4703]: E0309 13:22:14.706706 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:14 crc kubenswrapper[4703]: I0309 13:22:14.725800 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:14 crc kubenswrapper[4703]: I0309 13:22:14.740916 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-k
ube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:14 crc kubenswrapper[4703]: I0309 13:22:14.760385 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:14 crc kubenswrapper[4703]: I0309 13:22:14.778596 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:14 crc kubenswrapper[4703]: I0309 13:22:14.800304 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bf
cb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:14 crc kubenswrapper[4703]: E0309 13:22:14.815010 4703 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:22:14 crc kubenswrapper[4703]: I0309 13:22:14.826267 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11
\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:14 crc kubenswrapper[4703]: I0309 13:22:14.848886 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:14 crc kubenswrapper[4703]: I0309 13:22:14.868115 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e6a24cef765adee16f2208a4524331dfabdb46b2c9581f4fbe5e4c21e3fd394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac00693160cbdda2df9551e4cf1a43dfd092
f407a573bc1a9ec25ef4b383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2nrch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:14 crc kubenswrapper[4703]: I0309 13:22:14.880446 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jlgk5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"967e7a44-ac71-42aa-9847-37799ff35cc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jlgk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:14 crc 
kubenswrapper[4703]: I0309 13:22:14.894751 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740f23ba-2990-4e4a-8ea9-f6c8e8e68c22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5b7237bc4b4d4036a6f66dc705d398087158e42536fa4c5f5d1aa7c9c02d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a105bcef254c34ab707fc8dbc94df273ebda59e4d376c4598b2c7ba47458e6c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:20:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:20:06.706547 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 13:20:06.709297 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:20:06.744041 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:20:06.749709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:20:37.146329 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:20:37.146449 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e3485dc1ed7496fc861abc4578735829d38eb0dedae3210718362c42783bb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1230ee4792811f205c69f3e4db77cf395897c92e08634a9cd77c5bb17c6e88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1ee03239da8fa7cf5565ff3c6bba6a9d71db4cbcd64f08db0f0be0d8b6134a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:14 crc kubenswrapper[4703]: I0309 13:22:14.911171 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25eeebd3-86e9-4166-a909-2ad652d90eed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://518722d55747b0df3c6bdb05215d5c8439cef0e2d505a5effb510465273eb0dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://520d7468346bb67f0dc9bfd9c50a059a61cb96f7e43fa87dc52885b3c648a414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5de22dff9567e8ca8196c8e42d84edee93c6034417605502b8af248bf17d1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:14 crc kubenswrapper[4703]: I0309 13:22:14.941556 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:14 crc kubenswrapper[4703]: I0309 13:22:14.960688 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:14 crc kubenswrapper[4703]: I0309 13:22:14.973500 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:14 crc kubenswrapper[4703]: I0309 13:22:14.986674 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:14Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:15 crc kubenswrapper[4703]: I0309 13:22:15.018960 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:06Z\\\",\\\"message\\\":\\\"on-jlgk5]\\\\nI0309 13:22:06.598741 6882 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0309 13:22:06.598732 6882 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 13:22:06.598755 6882 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-jlgk5 
before timer (time: 2026-03-09 13:22:07.940244381 +0000 UTC m=+1.903284629): skip\\\\nI0309 13:22:06.598765 6882 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 36.541µs)\\\\nI0309 13:22:06.598780 6882 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:22:06.598799 6882 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:22:06.598830 6882 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 13:22:06.598866 6882 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:22:06.598955 6882 factory.go:656] Stopping watch factory\\\\nI0309 13:22:06.598977 6882 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:06.598981 6882 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:22:06.599054 6882 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 13:22:06.599159 6882 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9khwq_openshift-ovn-kubernetes(650f98b2-73a9-4c73-b0cf-70d3bdd61edd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450
b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:15 crc kubenswrapper[4703]: I0309 13:22:15.031808 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:15 crc kubenswrapper[4703]: I0309 13:22:15.051824 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:15 crc kubenswrapper[4703]: I0309 13:22:15.065792 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://065c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:15Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:15 crc kubenswrapper[4703]: I0309 13:22:15.706569 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:15 crc kubenswrapper[4703]: I0309 13:22:15.706641 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:15 crc kubenswrapper[4703]: I0309 13:22:15.706569 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:15 crc kubenswrapper[4703]: E0309 13:22:15.706757 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:15 crc kubenswrapper[4703]: E0309 13:22:15.706950 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:15 crc kubenswrapper[4703]: E0309 13:22:15.707249 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:16 crc kubenswrapper[4703]: I0309 13:22:16.706424 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:16 crc kubenswrapper[4703]: E0309 13:22:16.706638 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.608997 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.609080 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.609100 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.609128 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.609150 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4703]: E0309 13:22:17.629289 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:17Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.634677 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.634816 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.634877 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.634915 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.634943 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4703]: E0309 13:22:17.655215 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:17Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.660059 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.660123 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.660144 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.660166 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.660183 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4703]: E0309 13:22:17.678959 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:17Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.683532 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.683596 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.683619 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.683647 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.683670 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4703]: E0309 13:22:17.703951 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:17Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.706042 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.706114 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.706042 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:17 crc kubenswrapper[4703]: E0309 13:22:17.706260 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:17 crc kubenswrapper[4703]: E0309 13:22:17.706476 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:17 crc kubenswrapper[4703]: E0309 13:22:17.706631 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.710712 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.710807 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.710826 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.710881 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:17 crc kubenswrapper[4703]: I0309 13:22:17.710900 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:17Z","lastTransitionTime":"2026-03-09T13:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:17 crc kubenswrapper[4703]: E0309 13:22:17.732603 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:17Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:17 crc kubenswrapper[4703]: E0309 13:22:17.732829 4703 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:22:18 crc kubenswrapper[4703]: I0309 13:22:18.706673 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:18 crc kubenswrapper[4703]: E0309 13:22:18.706953 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:19 crc kubenswrapper[4703]: I0309 13:22:19.706153 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:19 crc kubenswrapper[4703]: I0309 13:22:19.706178 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:19 crc kubenswrapper[4703]: I0309 13:22:19.706178 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:19 crc kubenswrapper[4703]: E0309 13:22:19.706466 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:19 crc kubenswrapper[4703]: E0309 13:22:19.706301 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:19 crc kubenswrapper[4703]: E0309 13:22:19.706546 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:19 crc kubenswrapper[4703]: E0309 13:22:19.816764 4703 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:22:20 crc kubenswrapper[4703]: I0309 13:22:20.706607 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:20 crc kubenswrapper[4703]: E0309 13:22:20.707452 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:20 crc kubenswrapper[4703]: I0309 13:22:20.707964 4703 scope.go:117] "RemoveContainer" containerID="b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f" Mar 09 13:22:20 crc kubenswrapper[4703]: E0309 13:22:20.708303 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-9khwq_openshift-ovn-kubernetes(650f98b2-73a9-4c73-b0cf-70d3bdd61edd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" Mar 09 13:22:21 crc kubenswrapper[4703]: I0309 13:22:21.706343 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:21 crc kubenswrapper[4703]: I0309 13:22:21.706343 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:21 crc kubenswrapper[4703]: E0309 13:22:21.706562 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:21 crc kubenswrapper[4703]: E0309 13:22:21.706660 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:21 crc kubenswrapper[4703]: I0309 13:22:21.706374 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:21 crc kubenswrapper[4703]: E0309 13:22:21.706792 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:22 crc kubenswrapper[4703]: I0309 13:22:22.706116 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:22 crc kubenswrapper[4703]: E0309 13:22:22.706268 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:23 crc kubenswrapper[4703]: I0309 13:22:23.706366 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:23 crc kubenswrapper[4703]: I0309 13:22:23.706450 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:23 crc kubenswrapper[4703]: I0309 13:22:23.706516 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:23 crc kubenswrapper[4703]: E0309 13:22:23.706636 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:23 crc kubenswrapper[4703]: E0309 13:22:23.706766 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:23 crc kubenswrapper[4703]: E0309 13:22:23.706989 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:24 crc kubenswrapper[4703]: I0309 13:22:24.581300 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs\") pod \"network-metrics-daemon-jlgk5\" (UID: \"967e7a44-ac71-42aa-9847-37799ff35cc0\") " pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:24 crc kubenswrapper[4703]: E0309 13:22:24.581674 4703 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:24 crc kubenswrapper[4703]: E0309 13:22:24.582209 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs podName:967e7a44-ac71-42aa-9847-37799ff35cc0 nodeName:}" failed. No retries permitted until 2026-03-09 13:22:56.582165036 +0000 UTC m=+172.549580762 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs") pod "network-metrics-daemon-jlgk5" (UID: "967e7a44-ac71-42aa-9847-37799ff35cc0") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:24 crc kubenswrapper[4703]: I0309 13:22:24.706158 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:24 crc kubenswrapper[4703]: E0309 13:22:24.706643 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:24 crc kubenswrapper[4703]: I0309 13:22:24.722606 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jlgk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"967e7a44-ac71-42aa-9847-37799ff35cc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jlgk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:24 crc 
kubenswrapper[4703]: I0309 13:22:24.741567 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740f23ba-2990-4e4a-8ea9-f6c8e8e68c22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5b7237bc4b4d4036a6f66dc705d398087158e42536fa4c5f5d1aa7c9c02d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a105bcef254c34ab707fc8dbc94df273ebda59e4d376c4598b2c7ba47458e6c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:20:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:20:06.706547 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 13:20:06.709297 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:20:06.744041 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:20:06.749709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:20:37.146329 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:20:37.146449 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e3485dc1ed7496fc861abc4578735829d38eb0dedae3210718362c42783bb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1230ee4792811f205c69f3e4db77cf395897c92e08634a9cd77c5bb17c6e88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1ee03239da8fa7cf5565ff3c6bba6a9d71db4cbcd64f08db0f0be0d8b6134a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:24 crc kubenswrapper[4703]: I0309 13:22:24.756584 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25eeebd3-86e9-4166-a909-2ad652d90eed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://518722d55747b0df3c6bdb05215d5c8439cef0e2d505a5effb510465273eb0dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://520d7468346bb67f0dc9bfd9c50a059a61cb96f7e43fa87dc52885b3c648a414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5de22dff9567e8ca8196c8e42d84edee93c6034417605502b8af248bf17d1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:24 crc kubenswrapper[4703]: I0309 13:22:24.790680 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:24 crc kubenswrapper[4703]: I0309 13:22:24.810700 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:24 crc kubenswrapper[4703]: E0309 13:22:24.817358 4703 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:22:24 crc kubenswrapper[4703]: I0309 13:22:24.832243 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:24 crc kubenswrapper[4703]: I0309 13:22:24.847767 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:24 crc kubenswrapper[4703]: I0309 13:22:24.883391 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:06Z\\\",\\\"message\\\":\\\"on-jlgk5]\\\\nI0309 13:22:06.598741 6882 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0309 13:22:06.598732 6882 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 13:22:06.598755 6882 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-jlgk5 
before timer (time: 2026-03-09 13:22:07.940244381 +0000 UTC m=+1.903284629): skip\\\\nI0309 13:22:06.598765 6882 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 36.541µs)\\\\nI0309 13:22:06.598780 6882 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:22:06.598799 6882 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:22:06.598830 6882 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 13:22:06.598866 6882 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:22:06.598955 6882 factory.go:656] Stopping watch factory\\\\nI0309 13:22:06.598977 6882 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:06.598981 6882 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:22:06.599054 6882 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 13:22:06.599159 6882 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9khwq_openshift-ovn-kubernetes(650f98b2-73a9-4c73-b0cf-70d3bdd61edd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450
b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:24 crc kubenswrapper[4703]: I0309 13:22:24.898785 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:24 crc kubenswrapper[4703]: I0309 13:22:24.945314 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:24 crc kubenswrapper[4703]: I0309 13:22:24.965933 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://065c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:24 crc kubenswrapper[4703]: I0309 13:22:24.977812 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T1
3:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:24 crc kubenswrapper[4703]: I0309 13:22:24.995543 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:24Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:25 crc kubenswrapper[4703]: I0309 13:22:25.007926 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:25 crc kubenswrapper[4703]: I0309 13:22:25.022551 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:25 crc kubenswrapper[4703]: I0309 13:22:25.045704 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bf
cb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:25 crc kubenswrapper[4703]: I0309 13:22:25.062661 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:25 crc kubenswrapper[4703]: I0309 13:22:25.084345 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:25 crc kubenswrapper[4703]: I0309 13:22:25.101703 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e6a24cef765adee16f2208a4524331dfabdb46b2c9581f4fbe5e4c21e3fd394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac00693160cbdda2df9551e4cf1a43dfd092
f407a573bc1a9ec25ef4b383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2nrch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:25Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:25 crc kubenswrapper[4703]: I0309 13:22:25.706381 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:25 crc kubenswrapper[4703]: I0309 13:22:25.706381 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:25 crc kubenswrapper[4703]: E0309 13:22:25.706697 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:25 crc kubenswrapper[4703]: E0309 13:22:25.706545 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:25 crc kubenswrapper[4703]: I0309 13:22:25.707045 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:25 crc kubenswrapper[4703]: E0309 13:22:25.707335 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:26 crc kubenswrapper[4703]: I0309 13:22:26.706226 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:26 crc kubenswrapper[4703]: E0309 13:22:26.706378 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:26 crc kubenswrapper[4703]: I0309 13:22:26.811605 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n9x5k_d59f2278-9dbc-48bb-8d56-fa9da4183118/kube-multus/0.log" Mar 09 13:22:26 crc kubenswrapper[4703]: I0309 13:22:26.811714 4703 generic.go:334] "Generic (PLEG): container finished" podID="d59f2278-9dbc-48bb-8d56-fa9da4183118" containerID="6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526" exitCode=1 Mar 09 13:22:26 crc kubenswrapper[4703]: I0309 13:22:26.811777 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n9x5k" event={"ID":"d59f2278-9dbc-48bb-8d56-fa9da4183118","Type":"ContainerDied","Data":"6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526"} Mar 09 13:22:26 crc kubenswrapper[4703]: I0309 13:22:26.812743 4703 scope.go:117] "RemoveContainer" containerID="6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526" Mar 09 13:22:26 crc kubenswrapper[4703]: I0309 13:22:26.832548 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:26 crc kubenswrapper[4703]: I0309 13:22:26.859118 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bf
cb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:26 crc kubenswrapper[4703]: I0309 13:22:26.882975 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:26Z\\\",\\\"message\\\":\\\"2026-03-09T13:21:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6844f851-1a25-40a2-9260-012a820fb5a9\\\\n2026-03-09T13:21:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6844f851-1a25-40a2-9260-012a820fb5a9 to /host/opt/cni/bin/\\\\n2026-03-09T13:21:41Z [verbose] multus-daemon started\\\\n2026-03-09T13:21:41Z [verbose] Readiness Indicator file check\\\\n2026-03-09T13:22:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:26 crc kubenswrapper[4703]: I0309 13:22:26.899621 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mou
ntPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:26 crc kubenswrapper[4703]: I0309 13:22:26.920758 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:26 crc kubenswrapper[4703]: I0309 13:22:26.940060 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:26 crc kubenswrapper[4703]: I0309 13:22:26.955829 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:26 crc kubenswrapper[4703]: I0309 13:22:26.972165 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e6a24cef765adee16f2208a4524331dfabdb46b2c9581f4fbe5e4c21e3fd394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac00693160cbdda2df9551e4cf1a43dfd092
f407a573bc1a9ec25ef4b383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2nrch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:26 crc kubenswrapper[4703]: I0309 13:22:26.991658 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:26Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.010287 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.026408 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.057910 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:06Z\\\",\\\"message\\\":\\\"on-jlgk5]\\\\nI0309 13:22:06.598741 6882 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0309 13:22:06.598732 6882 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 13:22:06.598755 6882 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-jlgk5 
before timer (time: 2026-03-09 13:22:07.940244381 +0000 UTC m=+1.903284629): skip\\\\nI0309 13:22:06.598765 6882 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 36.541µs)\\\\nI0309 13:22:06.598780 6882 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:22:06.598799 6882 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:22:06.598830 6882 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 13:22:06.598866 6882 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:22:06.598955 6882 factory.go:656] Stopping watch factory\\\\nI0309 13:22:06.598977 6882 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:06.598981 6882 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:22:06.599054 6882 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 13:22:06.599159 6882 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9khwq_openshift-ovn-kubernetes(650f98b2-73a9-4c73-b0cf-70d3bdd61edd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450
b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.071047 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jlgk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"967e7a44-ac71-42aa-9847-37799ff35cc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jlgk5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.084888 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740f23ba-2990-4e4a-8ea9-f6c8e8e68c22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5b7237bc4b4d4036a6f66dc705d398087158e42536fa4c5f5d1aa7c9c02d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a105bcef254c34ab707fc8dbc94df273ebda59e4d376c4598b2c7ba47458e6c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:20:37Z\\
\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:20:06.706547 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 13:20:06.709297 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:20:06.744041 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:20:06.749709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:20:37.146329 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:20:37.146449 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e3485dc1ed7496fc861abc4578735829d38eb0dedae3210718362c42783bb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1230ee4792811f205c69f3e4db77cf395897c92e08634a9cd77c5bb17c6e88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1ee03239da8fa7cf5565ff3c6bba6a9d71db4cbcd64f08db0f0be0d8b6134a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.098258 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25eeebd3-86e9-4166-a909-2ad652d90eed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://518722d55747b0df3c6bdb05215d5c8439cef0e2d505a5effb510465273eb0dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://520d7468346bb67f0dc9bfd9c50a059a61cb96f7e43fa87dc52885b3c648a414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5de22dff9567e8ca8196c8e42d84edee93c6034417605502b8af248bf17d1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.127523 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.138520 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.154122 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.165431 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://065c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.706632 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.706685 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:27 crc kubenswrapper[4703]: E0309 13:22:27.706818 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.706654 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:27 crc kubenswrapper[4703]: E0309 13:22:27.707004 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:27 crc kubenswrapper[4703]: E0309 13:22:27.707231 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.809495 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.809547 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.809565 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.809583 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.809597 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.819344 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n9x5k_d59f2278-9dbc-48bb-8d56-fa9da4183118/kube-multus/0.log" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.819469 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n9x5k" event={"ID":"d59f2278-9dbc-48bb-8d56-fa9da4183118","Type":"ContainerStarted","Data":"d0ebf3127b5bf77a8c7241f6c6ee6e7507fcdf3cb9e424870b6c331bc5a40b3f"} Mar 09 13:22:27 crc kubenswrapper[4703]: E0309 13:22:27.828296 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.833477 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.833529 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.833542 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.833563 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.833575 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.839081 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: E0309 13:22:27.849638 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.853435 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.855536 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.855625 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc 
kubenswrapper[4703]: I0309 13:22:27.855645 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.855673 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.855696 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.868161 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://065c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: E0309 13:22:27.871143 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.875657 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.875698 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.875712 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.875730 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.875743 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.883687 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: E0309 13:22:27.892082 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.896251 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.896293 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.896218 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.896302 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.896520 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.896543 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:27Z","lastTransitionTime":"2026-03-09T13:22:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 13:22:27 crc kubenswrapper[4703]: E0309 13:22:27.913196 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: E0309 13:22:27.913316 4703 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.915647 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.929432 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.955972 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bf
cb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.971578 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ebf3127b5bf77a8c7241f6c6ee6e7507fcdf3cb9e424870b6c331bc5a40b3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-09T13:22:26Z\\\",\\\"message\\\":\\\"2026-03-09T13:21:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6844f851-1a25-40a2-9260-012a820fb5a9\\\\n2026-03-09T13:21:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6844f851-1a25-40a2-9260-012a820fb5a9 to /host/opt/cni/bin/\\\\n2026-03-09T13:21:41Z [verbose] multus-daemon started\\\\n2026-03-09T13:21:41Z [verbose] Readiness Indicator file check\\\\n2026-03-09T13:22:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.982954 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:27 crc kubenswrapper[4703]: I0309 13:22:27.993219 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e6a24cef765adee16f2208a4524331dfabdb46b2c9581f4fbe5e4c21e3fd394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac00693160cbdda2df9551e
4cf1a43dfd092f407a573bc1a9ec25ef4b383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2nrch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:27Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.009023 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:06Z\\\",\\\"message\\\":\\\"on-jlgk5]\\\\nI0309 13:22:06.598741 6882 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0309 13:22:06.598732 6882 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 13:22:06.598755 6882 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-jlgk5 
before timer (time: 2026-03-09 13:22:07.940244381 +0000 UTC m=+1.903284629): skip\\\\nI0309 13:22:06.598765 6882 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 36.541µs)\\\\nI0309 13:22:06.598780 6882 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:22:06.598799 6882 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:22:06.598830 6882 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 13:22:06.598866 6882 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:22:06.598955 6882 factory.go:656] Stopping watch factory\\\\nI0309 13:22:06.598977 6882 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:06.598981 6882 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:22:06.599054 6882 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 13:22:06.599159 6882 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9khwq_openshift-ovn-kubernetes(650f98b2-73a9-4c73-b0cf-70d3bdd61edd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450
b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.021158 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jlgk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"967e7a44-ac71-42aa-9847-37799ff35cc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jlgk5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.033516 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740f23ba-2990-4e4a-8ea9-f6c8e8e68c22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5b7237bc4b4d4036a6f66dc705d398087158e42536fa4c5f5d1aa7c9c02d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a105bcef254c34ab707fc8dbc94df273ebda59e4d376c4598b2c7ba47458e6c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:20:37Z\\
\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:20:06.706547 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 13:20:06.709297 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:20:06.744041 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:20:06.749709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:20:37.146329 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:20:37.146449 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e3485dc1ed7496fc861abc4578735829d38eb0dedae3210718362c42783bb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1230ee4792811f205c69f3e4db77cf395897c92e08634a9cd77c5bb17c6e88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1ee03239da8fa7cf5565ff3c6bba6a9d71db4cbcd64f08db0f0be0d8b6134a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.045429 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25eeebd3-86e9-4166-a909-2ad652d90eed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://518722d55747b0df3c6bdb05215d5c8439cef0e2d505a5effb510465273eb0dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://520d7468346bb67f0dc9bfd9c50a059a61cb96f7e43fa87dc52885b3c648a414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5de22dff9567e8ca8196c8e42d84edee93c6034417605502b8af248bf17d1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.063642 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.079303 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.090356 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.101294 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.566096 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.583471 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e6a24cef765adee16f2208a4524331dfabdb46b2c9581f4fbe5e4c21e3fd394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac00693160cbdda2df9551e4cf1a43dfd092f407a573bc1a9ec25ef4b383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03
-09T13:21:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2nrch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.597827 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.610834 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25eeebd3-86e9-4166-a909-2ad652d90eed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://518722d55747b0df3c6bdb05215d5c8439cef0e2d505a5effb510465273eb0dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://520d7468346bb67f0dc9bfd9c50a059a61cb96f7e43fa87dc52885b3c648a414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5de22dff9567e8ca8196c8e42d84edee93c6034417605502b8af248bf17d1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.631994 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.649468 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701
ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.664442 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.676151 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.700714 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:06Z\\\",\\\"message\\\":\\\"on-jlgk5]\\\\nI0309 13:22:06.598741 6882 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0309 13:22:06.598732 6882 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 13:22:06.598755 6882 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-jlgk5 
before timer (time: 2026-03-09 13:22:07.940244381 +0000 UTC m=+1.903284629): skip\\\\nI0309 13:22:06.598765 6882 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 36.541µs)\\\\nI0309 13:22:06.598780 6882 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:22:06.598799 6882 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:22:06.598830 6882 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 13:22:06.598866 6882 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:22:06.598955 6882 factory.go:656] Stopping watch factory\\\\nI0309 13:22:06.598977 6882 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:06.598981 6882 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:22:06.599054 6882 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 13:22:06.599159 6882 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9khwq_openshift-ovn-kubernetes(650f98b2-73a9-4c73-b0cf-70d3bdd61edd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450
b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.706901 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:28 crc kubenswrapper[4703]: E0309 13:22:28.707074 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.714511 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jlgk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"967e7a44-ac71-42aa-9847-37799ff35cc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jlgk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc 
kubenswrapper[4703]: I0309 13:22:28.727166 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740f23ba-2990-4e4a-8ea9-f6c8e8e68c22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5b7237bc4b4d4036a6f66dc705d398087158e42536fa4c5f5d1aa7c9c02d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a105bcef254c34ab707fc8dbc94df273ebda59e4d376c4598b2c7ba47458e6c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:20:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:20:06.706547 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 13:20:06.709297 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:20:06.744041 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:20:06.749709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:20:37.146329 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:20:37.146449 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e3485dc1ed7496fc861abc4578735829d38eb0dedae3210718362c42783bb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1230ee4792811f205c69f3e4db77cf395897c92e08634a9cd77c5bb17c6e88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1ee03239da8fa7cf5565ff3c6bba6a9d71db4cbcd64f08db0f0be0d8b6134a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.743496 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.756998 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://065c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.769135 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc 
kubenswrapper[4703]: I0309 13:22:28.787678 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.804689 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.816289 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.833538 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bf
cb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.851341 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ebf3127b5bf77a8c7241f6c6ee6e7507fcdf3cb9e424870b6c331bc5a40b3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-09T13:22:26Z\\\",\\\"message\\\":\\\"2026-03-09T13:21:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6844f851-1a25-40a2-9260-012a820fb5a9\\\\n2026-03-09T13:21:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6844f851-1a25-40a2-9260-012a820fb5a9 to /host/opt/cni/bin/\\\\n2026-03-09T13:21:41Z [verbose] multus-daemon started\\\\n2026-03-09T13:21:41Z [verbose] Readiness Indicator file check\\\\n2026-03-09T13:22:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:28 crc kubenswrapper[4703]: I0309 13:22:28.869423 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:22:28Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:29 crc kubenswrapper[4703]: I0309 13:22:29.706640 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:29 crc kubenswrapper[4703]: I0309 13:22:29.706667 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:29 crc kubenswrapper[4703]: I0309 13:22:29.706767 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:29 crc kubenswrapper[4703]: E0309 13:22:29.706889 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:29 crc kubenswrapper[4703]: E0309 13:22:29.706983 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:29 crc kubenswrapper[4703]: E0309 13:22:29.707115 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:29 crc kubenswrapper[4703]: E0309 13:22:29.819333 4703 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:22:30 crc kubenswrapper[4703]: I0309 13:22:30.706545 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:30 crc kubenswrapper[4703]: E0309 13:22:30.706765 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:31 crc kubenswrapper[4703]: I0309 13:22:31.706392 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:31 crc kubenswrapper[4703]: I0309 13:22:31.706403 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:31 crc kubenswrapper[4703]: E0309 13:22:31.706617 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:31 crc kubenswrapper[4703]: I0309 13:22:31.706654 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:31 crc kubenswrapper[4703]: E0309 13:22:31.707019 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:31 crc kubenswrapper[4703]: E0309 13:22:31.707217 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:31 crc kubenswrapper[4703]: I0309 13:22:31.707511 4703 scope.go:117] "RemoveContainer" containerID="b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f" Mar 09 13:22:31 crc kubenswrapper[4703]: I0309 13:22:31.835579 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9khwq_650f98b2-73a9-4c73-b0cf-70d3bdd61edd/ovnkube-controller/2.log" Mar 09 13:22:31 crc kubenswrapper[4703]: I0309 13:22:31.839167 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerStarted","Data":"eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7"} Mar 09 13:22:31 crc kubenswrapper[4703]: I0309 13:22:31.840109 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:22:31 crc kubenswrapper[4703]: I0309 13:22:31.865073 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:31Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:31 crc kubenswrapper[4703]: I0309 13:22:31.885179 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:31Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:31 crc kubenswrapper[4703]: I0309 13:22:31.901544 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:31Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:31 crc kubenswrapper[4703]: I0309 13:22:31.922575 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bf
cb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:31Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:31 crc kubenswrapper[4703]: I0309 13:22:31.941650 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ebf3127b5bf77a8c7241f6c6ee6e7507fcdf3cb9e424870b6c331bc5a40b3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-09T13:22:26Z\\\",\\\"message\\\":\\\"2026-03-09T13:21:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6844f851-1a25-40a2-9260-012a820fb5a9\\\\n2026-03-09T13:21:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6844f851-1a25-40a2-9260-012a820fb5a9 to /host/opt/cni/bin/\\\\n2026-03-09T13:21:41Z [verbose] multus-daemon started\\\\n2026-03-09T13:21:41Z [verbose] Readiness Indicator file check\\\\n2026-03-09T13:22:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:31Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:31 crc kubenswrapper[4703]: I0309 13:22:31.955513 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:22:31Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:31 crc kubenswrapper[4703]: I0309 13:22:31.967887 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e6a24cef765adee16f2208a4524331dfabdb46b2c9581f4fbe5e4c21e3fd394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac00693160cbdda2df9551e4cf1a43dfd092f407a573bc1a9ec25ef4b383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2nrch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:31Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:31 crc kubenswrapper[4703]: I0309 13:22:31.980679 4703 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:31Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:31 crc kubenswrapper[4703]: I0309 13:22:31.994709 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25eeebd3-86e9-4166-a909-2ad652d90eed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://518722d55747b0df3c6bdb05215d5c8439cef0e2d505a5effb510465273eb0dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://520d7468346bb67f0dc9bfd9c50a059a61cb96f7e43fa87dc52885b3c648a414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5de22dff9567e8ca8196c8e42d84edee93c6034417605502b8af248bf17d1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:31Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.019130 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:32Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.035386 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701
ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:32Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.055033 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:32Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.067039 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:32Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.087980 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:06Z\\\",\\\"message\\\":\\\"on-jlgk5]\\\\nI0309 13:22:06.598741 6882 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0309 13:22:06.598732 6882 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 13:22:06.598755 6882 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-jlgk5 
before timer (time: 2026-03-09 13:22:07.940244381 +0000 UTC m=+1.903284629): skip\\\\nI0309 13:22:06.598765 6882 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 36.541µs)\\\\nI0309 13:22:06.598780 6882 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:22:06.598799 6882 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:22:06.598830 6882 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 13:22:06.598866 6882 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:22:06.598955 6882 factory.go:656] Stopping watch factory\\\\nI0309 13:22:06.598977 6882 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:06.598981 6882 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:22:06.599054 6882 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 13:22:06.599159 6882 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:32Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.099546 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jlgk5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"967e7a44-ac71-42aa-9847-37799ff35cc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jlgk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:32Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:32 crc 
kubenswrapper[4703]: I0309 13:22:32.118859 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740f23ba-2990-4e4a-8ea9-f6c8e8e68c22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5b7237bc4b4d4036a6f66dc705d398087158e42536fa4c5f5d1aa7c9c02d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a105bcef254c34ab707fc8dbc94df273ebda59e4d376c4598b2c7ba47458e6c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:20:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:20:06.706547 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 13:20:06.709297 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:20:06.744041 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:20:06.749709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:20:37.146329 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:20:37.146449 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e3485dc1ed7496fc861abc4578735829d38eb0dedae3210718362c42783bb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1230ee4792811f205c69f3e4db77cf395897c92e08634a9cd77c5bb17c6e88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1ee03239da8fa7cf5565ff3c6bba6a9d71db4cbcd64f08db0f0be0d8b6134a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:32Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.133006 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:32Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.144787 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://065c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:32Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.154234 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:32Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:32 crc 
kubenswrapper[4703]: I0309 13:22:32.706676 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:32 crc kubenswrapper[4703]: E0309 13:22:32.706918 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.845597 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9khwq_650f98b2-73a9-4c73-b0cf-70d3bdd61edd/ovnkube-controller/3.log" Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.846495 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9khwq_650f98b2-73a9-4c73-b0cf-70d3bdd61edd/ovnkube-controller/2.log" Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.850319 4703 generic.go:334] "Generic (PLEG): container finished" podID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerID="eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7" exitCode=1 Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.850387 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerDied","Data":"eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7"} Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.850452 4703 scope.go:117] "RemoveContainer" containerID="b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f" Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.853674 4703 scope.go:117] "RemoveContainer" 
containerID="eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7" Mar 09 13:22:32 crc kubenswrapper[4703]: E0309 13:22:32.854368 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-9khwq_openshift-ovn-kubernetes(650f98b2-73a9-4c73-b0cf-70d3bdd61edd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.882210 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:32Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.897485 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:32Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.914967 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bf
cb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:32Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.934555 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ebf3127b5bf77a8c7241f6c6ee6e7507fcdf3cb9e424870b6c331bc5a40b3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-09T13:22:26Z\\\",\\\"message\\\":\\\"2026-03-09T13:21:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6844f851-1a25-40a2-9260-012a820fb5a9\\\\n2026-03-09T13:21:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6844f851-1a25-40a2-9260-012a820fb5a9 to /host/opt/cni/bin/\\\\n2026-03-09T13:21:41Z [verbose] multus-daemon started\\\\n2026-03-09T13:21:41Z [verbose] Readiness Indicator file check\\\\n2026-03-09T13:22:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:32Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.950263 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T13:22:32Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.969669 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:32Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.983153 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:32Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:32 crc kubenswrapper[4703]: I0309 13:22:32.998132 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e6a24cef765adee16f2208a4524331dfabdb46b2c9581f4fbe5e4c21e3fd394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac00693160cbdda2df9551e4cf1a43dfd092
f407a573bc1a9ec25ef4b383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2nrch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:32Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:33 crc kubenswrapper[4703]: I0309 13:22:33.020469 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:33 crc kubenswrapper[4703]: I0309 13:22:33.040392 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:33 crc kubenswrapper[4703]: I0309 13:22:33.058293 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:33 crc kubenswrapper[4703]: I0309 13:22:33.072319 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:33 crc kubenswrapper[4703]: I0309 13:22:33.102970 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7549fc87aaa8a0db4824aaedaf81c2e7204df22707b151d2e1f5a468ca3fe9f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:06Z\\\",\\\"message\\\":\\\"on-jlgk5]\\\\nI0309 13:22:06.598741 6882 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0309 13:22:06.598732 6882 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 13:22:06.598755 6882 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-jlgk5 
before timer (time: 2026-03-09 13:22:07.940244381 +0000 UTC m=+1.903284629): skip\\\\nI0309 13:22:06.598765 6882 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 36.541µs)\\\\nI0309 13:22:06.598780 6882 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:22:06.598799 6882 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 13:22:06.598830 6882 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 13:22:06.598866 6882 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:22:06.598955 6882 factory.go:656] Stopping watch factory\\\\nI0309 13:22:06.598977 6882 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:06.598981 6882 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:22:06.599054 6882 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 13:22:06.599159 6882 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:22:32.559419 7188 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:22:32.559434 7188 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:22:32.559470 7188 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 13:22:32.559522 7188 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 13:22:32.559531 7188 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 
13:22:32.559541 7188 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0309 13:22:32.559551 7188 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0309 13:22:32.559573 7188 factory.go:656] Stopping watch factory\\\\nI0309 13:22:32.559586 7188 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:32.559612 7188 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:22:32.559625 7188 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 13:22:32.559630 7188 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 13:22:32.559637 7188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:22:32.559642 7188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 13:22:32.559647 7188 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 13:22:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\
\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x
9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:33 crc kubenswrapper[4703]: I0309 13:22:33.119723 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jlgk5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"967e7a44-ac71-42aa-9847-37799ff35cc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jlgk5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:33 crc 
kubenswrapper[4703]: I0309 13:22:33.140483 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740f23ba-2990-4e4a-8ea9-f6c8e8e68c22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5b7237bc4b4d4036a6f66dc705d398087158e42536fa4c5f5d1aa7c9c02d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a105bcef254c34ab707fc8dbc94df273ebda59e4d376c4598b2c7ba47458e6c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:20:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:20:06.706547 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 13:20:06.709297 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:20:06.744041 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:20:06.749709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:20:37.146329 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:20:37.146449 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e3485dc1ed7496fc861abc4578735829d38eb0dedae3210718362c42783bb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1230ee4792811f205c69f3e4db77cf395897c92e08634a9cd77c5bb17c6e88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1ee03239da8fa7cf5565ff3c6bba6a9d71db4cbcd64f08db0f0be0d8b6134a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:33 crc kubenswrapper[4703]: I0309 13:22:33.160012 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25eeebd3-86e9-4166-a909-2ad652d90eed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://518722d55747b0df3c6bdb05215d5c8439cef0e2d505a5effb510465273eb0dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://520d7468346bb67f0dc9bfd9c50a059a61cb96f7e43fa87dc52885b3c648a414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5de22dff9567e8ca8196c8e42d84edee93c6034417605502b8af248bf17d1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:33 crc kubenswrapper[4703]: I0309 13:22:33.177067 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://065
c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:33 crc kubenswrapper[4703]: I0309 13:22:33.192607 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:33 crc kubenswrapper[4703]: I0309 13:22:33.209156 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:33 crc kubenswrapper[4703]: I0309 13:22:33.706516 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:33 crc kubenswrapper[4703]: I0309 13:22:33.706596 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:33 crc kubenswrapper[4703]: I0309 13:22:33.706654 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:33 crc kubenswrapper[4703]: E0309 13:22:33.706694 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:33 crc kubenswrapper[4703]: E0309 13:22:33.707012 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:33 crc kubenswrapper[4703]: E0309 13:22:33.707196 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:33 crc kubenswrapper[4703]: I0309 13:22:33.856328 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9khwq_650f98b2-73a9-4c73-b0cf-70d3bdd61edd/ovnkube-controller/3.log" Mar 09 13:22:33 crc kubenswrapper[4703]: I0309 13:22:33.861243 4703 scope.go:117] "RemoveContainer" containerID="eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7" Mar 09 13:22:33 crc kubenswrapper[4703]: E0309 13:22:33.861452 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-9khwq_openshift-ovn-kubernetes(650f98b2-73a9-4c73-b0cf-70d3bdd61edd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" Mar 09 13:22:33 crc kubenswrapper[4703]: I0309 13:22:33.881975 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:33 crc kubenswrapper[4703]: I0309 13:22:33.895441 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e6a24cef765adee16f2208a4524331dfabdb46b2c9581f4fbe5e4c21e3fd394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac00693160cbdda2df9551e4cf1a43dfd092
f407a573bc1a9ec25ef4b383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2nrch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:33 crc kubenswrapper[4703]: I0309 13:22:33.915649 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"740f23ba-2990-4e4a-8ea9-f6c8e8e68c22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5b7237bc4b4d4036a6f66dc705d398087158e42536fa4c5f5d1aa7c9c02d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a105bcef254c34ab707fc8dbc94df273ebda59e4d376c4598b2c7ba47458e6c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:20:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 13:20:06.706547 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 13:20:06.709297 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:20:06.744041 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:20:06.749709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:20:37.146329 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:20:37.146449 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e3485dc1ed7496fc861abc4578735829d38eb0dedae3210718362c42783bb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1230ee4792811f205c69f3e4db77cf395897c92e08634a9cd77c5bb17c6e88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1ee03239da8fa7cf5565ff3c6bba6a9d71db4cbcd64f08db0f0be0d8b6134a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:33 crc kubenswrapper[4703]: I0309 13:22:33.934169 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25eeebd3-86e9-4166-a909-2ad652d90eed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://518722d55747b0df3c6bdb05215d5c8439cef0e2d505a5effb510465273eb0dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://520d7468346bb67f0dc9bfd9c50a059a61cb96f7e43fa87dc52885b3c648a414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5de22dff9567e8ca8196c8e42d84edee93c6034417605502b8af248bf17d1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:33 crc kubenswrapper[4703]: I0309 13:22:33.968521 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:33 crc kubenswrapper[4703]: I0309 13:22:33.985220 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:33Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.003203 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.016918 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.035874 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:22:32.559419 7188 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:22:32.559434 7188 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:22:32.559470 7188 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 
13:22:32.559522 7188 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 13:22:32.559531 7188 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 13:22:32.559541 7188 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0309 13:22:32.559551 7188 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0309 13:22:32.559573 7188 factory.go:656] Stopping watch factory\\\\nI0309 13:22:32.559586 7188 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:32.559612 7188 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:22:32.559625 7188 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 13:22:32.559630 7188 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 13:22:32.559637 7188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:22:32.559642 7188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 13:22:32.559647 7188 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 13:22:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9khwq_openshift-ovn-kubernetes(650f98b2-73a9-4c73-b0cf-70d3bdd61edd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450
b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.048923 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jlgk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"967e7a44-ac71-42aa-9847-37799ff35cc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jlgk5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.064462 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.081750 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.094199 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://065c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.109172 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T1
3:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.122268 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.133437 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.144262 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.156644 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bf
cb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.168451 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ebf3127b5bf77a8c7241f6c6ee6e7507fcdf3cb9e424870b6c331bc5a40b3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-09T13:22:26Z\\\",\\\"message\\\":\\\"2026-03-09T13:21:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6844f851-1a25-40a2-9260-012a820fb5a9\\\\n2026-03-09T13:21:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6844f851-1a25-40a2-9260-012a820fb5a9 to /host/opt/cni/bin/\\\\n2026-03-09T13:21:41Z [verbose] multus-daemon started\\\\n2026-03-09T13:21:41Z [verbose] Readiness Indicator file check\\\\n2026-03-09T13:22:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.706567 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:34 crc kubenswrapper[4703]: E0309 13:22:34.706906 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.731579 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.750952 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42dec49c-ba2a-4e7e-a8fa-d1372e7aeb8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e6a24cef765adee16f2208a4524331dfabdb46b2c9581f4fbe5e4c21e3fd394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac00693160cbdda2df9551e4cf1a43dfd092
f407a573bc1a9ec25ef4b383911\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lp62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2nrch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.770898 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.784335 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q5n88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2311ba56-bb75-4876-ad86-6c74012001ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4dc5ac9aa9c79288c719dacdaec2bcc2b933ab86de98c0a93cf7e0bcbf6fc98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbxp9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q5n88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.805486 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:32Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 13:22:32.559419 7188 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 13:22:32.559434 7188 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 13:22:32.559470 7188 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 
13:22:32.559522 7188 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 13:22:32.559531 7188 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 13:22:32.559541 7188 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0309 13:22:32.559551 7188 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0309 13:22:32.559573 7188 factory.go:656] Stopping watch factory\\\\nI0309 13:22:32.559586 7188 ovnkube.go:599] Stopped ovnkube\\\\nI0309 13:22:32.559612 7188 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 13:22:32.559625 7188 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 13:22:32.559630 7188 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 13:22:32.559637 7188 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 13:22:32.559642 7188 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 13:22:32.559647 7188 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0309 13:22:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:22:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9khwq_openshift-ovn-kubernetes(650f98b2-73a9-4c73-b0cf-70d3bdd61edd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1607920f4157e81450
b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9psl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9khwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.818805 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jlgk5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"967e7a44-ac71-42aa-9847-37799ff35cc0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g2c4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jlgk5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: E0309 13:22:34.820130 4703 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.834824 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740f23ba-2990-4e4a-8ea9-f6c8e8e68c22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5b7237bc4b4d4036a6f66dc705d398087158e42536fa4c5f5d1aa7c9c02d95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a105bcef254c34ab707fc8dbc94df273ebda59e4d376c4598b2c7ba47458e6c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:20:37Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 13:20:06.706547 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 13:20:06.709297 1 observer_polling.go:159] Starting file observer\\\\nI0309 13:20:06.744041 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 13:20:06.749709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 13:20:37.146329 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 13:20:37.146449 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e3485dc1ed7496fc861abc4578735829d38eb0dedae3210718362c42783bb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c1230ee4792811f205c69f3e4db77cf395897c92e08634a9cd77c5bb17c6e88\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f1ee03239da8fa7cf5565ff3c6bba6a9d71db4cbcd64f08db0f0be0d8b6134a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.847002 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25eeebd3-86e9-4166-a909-2ad652d90eed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://518722d55747b0df3c6bdb05215d5c8439cef0e2d505a5effb510465273eb0dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-sche
duler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://520d7468346bb67f0dc9bfd9c50a059a61cb96f7e43fa87dc52885b3c648a414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5de22dff9567e8ca8196c8e42d84edee93c6034417605502b8af248bf17d1f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af0755eb3c8babed1982da5b567e39a126e43fa5b9c46a5f5b6a6d4b823f599a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.865385 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c84ae6df-f843-449e-ad22-4830d7717a3e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a825114fe508b2472f43f50d8db076ee5aa90a92c80655652a3eb58bdeef394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64e88037df4c29213b17244d0ba0f94d22364d553c9a77307f605ef4fc92574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38f47387bd38080f0c71d138379a144f291da9851bb812095ed4326356051bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2ab4dfcd62adc0b2a0888eebf9c7ee2fc71425ce22ace03cb55f89fed1a8d8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b362c55b9303dcafef2e8c8bc8e190883a56bbfa2e462a2cdaad4f61e1885ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84d684835dd0740a4a20b7ad3e161360927d8678f69d8395fb022ec715606bbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe8e510d642e47186b26d84b9be5b0f6b65717dd9fe72f1daaab2463d7a639c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15691db21340ab64dcd0f249c65cc1d51e3946e16903b744b1114b77a1111a8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.878071 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c533cd6b-0f98-4b0f-b515-cced796ce36f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T13:21:19Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 13:21:19.437050 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 13:21:19.437227 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 13:21:19.438247 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1188282274/tls.crt::/tmp/serving-cert-1188282274/tls.key\\\\\\\"\\\\nI0309 13:21:19.730470 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 13:21:19.731996 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 13:21:19.732014 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 13:21:19.732033 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 13:21:19.732038 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 13:21:19.736367 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 13:21:19.736397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736401 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 13:21:19.736405 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 13:21:19.736408 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 13:21:19.736411 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 13:21:19.736413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 13:21:19.736419 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 13:21:19.738593 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.887056 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"26defdc0-528c-4be2-862b-344697728add\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:20:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d4a4653d7c5c131cab9a6499e04a3c997745594477b4ccaf92226861fbffd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:20:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee2fa039bf38022cabd86d6ff0e9d3708a13878f77ffe7ed2f19c9dd0e9702ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:20:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:20:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.896450 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da06196d99c278f7b49a0eed1bf5cf9d17578bd5121c3ee8b34386e784433d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ac162536a73522b49277b433d449bfbe8281568fa50f7d5cbbb102c72649e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.903804 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2gp76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"186829e9-3995-46ff-807c-06bf53e9a4e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://065c82d1e90728104fda794d98cf44400691ee7faed0086e20559fe25ff998ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28tts\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2gp76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.914390 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rwff8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4fbab78b-1484-4244-8d11-ec4f47b43718\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a25a7f8e129b1573ad893f7d4b7b293cd29a79085dfde8cb188f28c06db1d6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c38e41182e09934b5a163dcff31e38e8995363f11527fa418f39b23cdd8059f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f030f337b3
76f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f030f337b376f1c98cacdc129cfeb25bdb5549531b5ddb095f652207acbd37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0079733b4d09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0079733b4d
09a55b89fed10f99ca489a428356f14f1dcbfed20a089870ffce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b6bfcb3df8819f51393efd8eb6462e79c1d6b1483396677f0a8c83da403af63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75289bb4974f7a3a0a74091eff93c34b6904ed253555e1e9fdbced7ab36aa96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc3065b6d9138a4e88152c06da3658263d2283b76e7df43d6012cb07f16472c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-09T13:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tkddz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rwff8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.926895 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n9x5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d59f2278-9dbc-48bb-8d56-fa9da4183118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0ebf3127b5bf77a8c7241f6c6ee6e7507fcdf3cb9e424870b6c331bc5a40b3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T13:22:26Z\\\",\\\"message\\\":\\\"2026-03-09T13:21:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6844f851-1a25-40a2-9260-012a820fb5a9\\\\n2026-03-09T13:21:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6844f851-1a25-40a2-9260-012a820fb5a9 to /host/opt/cni/bin/\\\\n2026-03-09T13:21:41Z [verbose] multus-daemon started\\\\n2026-03-09T13:21:41Z [verbose] 
Readiness Indicator file check\\\\n2026-03-09T13:22:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:22:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swqtr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n9x5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.938278 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57c4a00b6ec6cb064bf44b879ea7bac81d72bfd40b281f9f620cd992f9c7c5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512
335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.951944 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cd2f672cf07cba374e3a4121c7f2995fb7df46ea3dade650a30d575fa4c662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.964106 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:34 crc kubenswrapper[4703]: I0309 13:22:34.976265 4703 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T13:21:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7882337a4e4e81376135a9039edad30c63a203c69df372b85349e99012634251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117
fceeac979967d0eb1b49823f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T13:21:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4z2rc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T13:21:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-pmzvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:34Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:35 crc kubenswrapper[4703]: I0309 13:22:35.706769 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:35 crc kubenswrapper[4703]: I0309 13:22:35.706836 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:35 crc kubenswrapper[4703]: I0309 13:22:35.706898 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:35 crc kubenswrapper[4703]: E0309 13:22:35.707027 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:35 crc kubenswrapper[4703]: E0309 13:22:35.707126 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:35 crc kubenswrapper[4703]: E0309 13:22:35.707325 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:36 crc kubenswrapper[4703]: I0309 13:22:36.706567 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:36 crc kubenswrapper[4703]: E0309 13:22:36.707219 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:37 crc kubenswrapper[4703]: I0309 13:22:37.706654 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:37 crc kubenswrapper[4703]: I0309 13:22:37.706705 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:37 crc kubenswrapper[4703]: I0309 13:22:37.706728 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:37 crc kubenswrapper[4703]: E0309 13:22:37.706871 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:37 crc kubenswrapper[4703]: E0309 13:22:37.707022 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:37 crc kubenswrapper[4703]: E0309 13:22:37.707247 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:37 crc kubenswrapper[4703]: I0309 13:22:37.998038 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:37 crc kubenswrapper[4703]: I0309 13:22:37.998114 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:37 crc kubenswrapper[4703]: I0309 13:22:37.998135 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:37 crc kubenswrapper[4703]: I0309 13:22:37.998163 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:37 crc kubenswrapper[4703]: I0309 13:22:37.998182 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:37Z","lastTransitionTime":"2026-03-09T13:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4703]: E0309 13:22:38.019764 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:38Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:38 crc kubenswrapper[4703]: I0309 13:22:38.024312 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4703]: I0309 13:22:38.024366 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4703]: I0309 13:22:38.024379 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4703]: I0309 13:22:38.024398 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4703]: I0309 13:22:38.024408 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4703]: E0309 13:22:38.045466 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:38Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:38 crc kubenswrapper[4703]: I0309 13:22:38.050693 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:38 crc kubenswrapper[4703]: I0309 13:22:38.050752 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:38 crc kubenswrapper[4703]: I0309 13:22:38.050769 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:38 crc kubenswrapper[4703]: I0309 13:22:38.050796 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:38 crc kubenswrapper[4703]: I0309 13:22:38.050816 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4703]: I0309 13:22:38.078553 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4703]: I0309 13:22:38.109550 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:38Z","lastTransitionTime":"2026-03-09T13:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:38 crc kubenswrapper[4703]: E0309 13:22:38.134202 4703 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T13:22:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T13:22:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c2e7bdb7-d3ed-4fec-a29a-8e161e5baf7d\\\",\\\"systemUUID\\\":\\\"d1c69174-e64f-4790-ab29-a0802c299c7e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T13:22:38Z is after 2025-08-24T17:21:41Z" Mar 09 13:22:38 crc kubenswrapper[4703]: E0309 13:22:38.134432 4703 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 13:22:38 crc kubenswrapper[4703]: I0309 13:22:38.158106 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:22:38 crc kubenswrapper[4703]: E0309 13:22:38.158266 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:42.158232205 +0000 UTC m=+218.125647931 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:22:38 crc kubenswrapper[4703]: I0309 13:22:38.259758 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:38 crc kubenswrapper[4703]: I0309 13:22:38.259834 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:38 crc kubenswrapper[4703]: I0309 13:22:38.259936 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:38 crc kubenswrapper[4703]: I0309 13:22:38.259999 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:38 crc kubenswrapper[4703]: E0309 13:22:38.260011 4703 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:38 crc kubenswrapper[4703]: E0309 13:22:38.260080 4703 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:38 crc kubenswrapper[4703]: E0309 13:22:38.260096 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:38 crc kubenswrapper[4703]: E0309 13:22:38.260145 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:38 crc kubenswrapper[4703]: E0309 13:22:38.260170 4703 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:38 crc kubenswrapper[4703]: E0309 13:22:38.260107 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:23:42.260079701 +0000 UTC m=+218.227495427 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 13:22:38 crc kubenswrapper[4703]: E0309 13:22:38.260206 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 13:22:38 crc kubenswrapper[4703]: E0309 13:22:38.260259 4703 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 13:22:38 crc kubenswrapper[4703]: E0309 13:22:38.260280 4703 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:38 crc kubenswrapper[4703]: E0309 13:22:38.260343 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 13:23:42.260242036 +0000 UTC m=+218.227657762 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 13:22:38 crc kubenswrapper[4703]: E0309 13:22:38.260410 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 13:23:42.26038619 +0000 UTC m=+218.227801996 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:38 crc kubenswrapper[4703]: E0309 13:22:38.260451 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 13:23:42.260438232 +0000 UTC m=+218.227853948 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 13:22:38 crc kubenswrapper[4703]: I0309 13:22:38.706325 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:38 crc kubenswrapper[4703]: E0309 13:22:38.706559 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:39 crc kubenswrapper[4703]: I0309 13:22:39.706559 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:39 crc kubenswrapper[4703]: I0309 13:22:39.706711 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:39 crc kubenswrapper[4703]: I0309 13:22:39.706919 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:39 crc kubenswrapper[4703]: E0309 13:22:39.706923 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:39 crc kubenswrapper[4703]: E0309 13:22:39.707088 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:39 crc kubenswrapper[4703]: E0309 13:22:39.707193 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:39 crc kubenswrapper[4703]: E0309 13:22:39.822250 4703 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:22:40 crc kubenswrapper[4703]: I0309 13:22:40.707303 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:40 crc kubenswrapper[4703]: E0309 13:22:40.707550 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:41 crc kubenswrapper[4703]: I0309 13:22:41.706967 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:41 crc kubenswrapper[4703]: I0309 13:22:41.706967 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:41 crc kubenswrapper[4703]: E0309 13:22:41.707225 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:41 crc kubenswrapper[4703]: I0309 13:22:41.706983 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:41 crc kubenswrapper[4703]: E0309 13:22:41.707311 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:41 crc kubenswrapper[4703]: E0309 13:22:41.707483 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:42 crc kubenswrapper[4703]: I0309 13:22:42.706396 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:42 crc kubenswrapper[4703]: E0309 13:22:42.706676 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:43 crc kubenswrapper[4703]: I0309 13:22:43.706778 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:43 crc kubenswrapper[4703]: I0309 13:22:43.706778 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:43 crc kubenswrapper[4703]: I0309 13:22:43.706834 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:43 crc kubenswrapper[4703]: E0309 13:22:43.707286 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:43 crc kubenswrapper[4703]: E0309 13:22:43.707626 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:43 crc kubenswrapper[4703]: E0309 13:22:43.707997 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:44 crc kubenswrapper[4703]: I0309 13:22:44.706752 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:44 crc kubenswrapper[4703]: E0309 13:22:44.707042 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:44 crc kubenswrapper[4703]: I0309 13:22:44.816909 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podStartSLOduration=108.816881636 podStartE2EDuration="1m48.816881636s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:44.787638503 +0000 UTC m=+160.755054229" watchObservedRunningTime="2026-03-09 13:22:44.816881636 +0000 UTC m=+160.784297352" Mar 09 13:22:44 crc kubenswrapper[4703]: I0309 13:22:44.817131 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rwff8" podStartSLOduration=108.817101033 podStartE2EDuration="1m48.817101033s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:44.816618868 +0000 UTC m=+160.784034614" watchObservedRunningTime="2026-03-09 13:22:44.817101033 +0000 UTC m=+160.784516749" Mar 09 13:22:44 crc kubenswrapper[4703]: E0309 13:22:44.822910 4703 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:22:44 crc kubenswrapper[4703]: I0309 13:22:44.844452 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-n9x5k" podStartSLOduration=108.844432739 podStartE2EDuration="1m48.844432739s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:44.844357597 +0000 UTC m=+160.811773353" watchObservedRunningTime="2026-03-09 13:22:44.844432739 +0000 UTC m=+160.811848425" Mar 09 13:22:44 crc kubenswrapper[4703]: I0309 13:22:44.887190 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2nrch" podStartSLOduration=107.887160571 podStartE2EDuration="1m47.887160571s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:44.875433995 +0000 UTC m=+160.842849681" watchObservedRunningTime="2026-03-09 13:22:44.887160571 +0000 UTC m=+160.854576277" Mar 09 13:22:44 crc kubenswrapper[4703]: I0309 13:22:44.901206 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=41.901189625 podStartE2EDuration="41.901189625s" podCreationTimestamp="2026-03-09 13:22:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:44.900307449 +0000 UTC m=+160.867723135" watchObservedRunningTime="2026-03-09 13:22:44.901189625 +0000 UTC m=+160.868605311" Mar 09 13:22:44 crc kubenswrapper[4703]: I0309 13:22:44.919886 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" 
podStartSLOduration=64.919863156 podStartE2EDuration="1m4.919863156s" podCreationTimestamp="2026-03-09 13:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:44.919143785 +0000 UTC m=+160.886559491" watchObservedRunningTime="2026-03-09 13:22:44.919863156 +0000 UTC m=+160.887278842" Mar 09 13:22:44 crc kubenswrapper[4703]: I0309 13:22:44.933756 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=63.933739156 podStartE2EDuration="1m3.933739156s" podCreationTimestamp="2026-03-09 13:21:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:44.933426897 +0000 UTC m=+160.900842593" watchObservedRunningTime="2026-03-09 13:22:44.933739156 +0000 UTC m=+160.901154862" Mar 09 13:22:44 crc kubenswrapper[4703]: I0309 13:22:44.956559 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-q5n88" podStartSLOduration=108.956538179 podStartE2EDuration="1m48.956538179s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:44.955699694 +0000 UTC m=+160.923115380" watchObservedRunningTime="2026-03-09 13:22:44.956538179 +0000 UTC m=+160.923953865" Mar 09 13:22:45 crc kubenswrapper[4703]: I0309 13:22:45.040738 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=44.040718334 podStartE2EDuration="44.040718334s" podCreationTimestamp="2026-03-09 13:22:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 
13:22:45.040354053 +0000 UTC m=+161.007769759" watchObservedRunningTime="2026-03-09 13:22:45.040718334 +0000 UTC m=+161.008134020" Mar 09 13:22:45 crc kubenswrapper[4703]: I0309 13:22:45.066518 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2gp76" podStartSLOduration=109.066492065 podStartE2EDuration="1m49.066492065s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:45.066244767 +0000 UTC m=+161.033660453" watchObservedRunningTime="2026-03-09 13:22:45.066492065 +0000 UTC m=+161.033907801" Mar 09 13:22:45 crc kubenswrapper[4703]: I0309 13:22:45.076877 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=70.07682392 podStartE2EDuration="1m10.07682392s" podCreationTimestamp="2026-03-09 13:21:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:45.076118979 +0000 UTC m=+161.043534685" watchObservedRunningTime="2026-03-09 13:22:45.07682392 +0000 UTC m=+161.044239606" Mar 09 13:22:45 crc kubenswrapper[4703]: I0309 13:22:45.706160 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:45 crc kubenswrapper[4703]: I0309 13:22:45.706224 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:45 crc kubenswrapper[4703]: I0309 13:22:45.706176 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:45 crc kubenswrapper[4703]: E0309 13:22:45.706371 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:45 crc kubenswrapper[4703]: E0309 13:22:45.706690 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:45 crc kubenswrapper[4703]: E0309 13:22:45.706579 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:46 crc kubenswrapper[4703]: I0309 13:22:46.706610 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:46 crc kubenswrapper[4703]: E0309 13:22:46.706867 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:47 crc kubenswrapper[4703]: I0309 13:22:47.706719 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:47 crc kubenswrapper[4703]: I0309 13:22:47.706780 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:47 crc kubenswrapper[4703]: I0309 13:22:47.706896 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:47 crc kubenswrapper[4703]: E0309 13:22:47.706958 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:47 crc kubenswrapper[4703]: E0309 13:22:47.707117 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:47 crc kubenswrapper[4703]: E0309 13:22:47.707365 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.376672 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.376739 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.376761 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.376789 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.376814 4703 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T13:22:48Z","lastTransitionTime":"2026-03-09T13:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.446177 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzzjk"] Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.447480 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzzjk" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.450154 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.450543 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.452045 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.452352 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.520387 4703 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.530174 4703 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.585267 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02a87016-8f06-46ab-b12a-734fdd387cdf-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hzzjk\" (UID: \"02a87016-8f06-46ab-b12a-734fdd387cdf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzzjk" Mar 09 
13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.585384 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/02a87016-8f06-46ab-b12a-734fdd387cdf-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hzzjk\" (UID: \"02a87016-8f06-46ab-b12a-734fdd387cdf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzzjk" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.585420 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02a87016-8f06-46ab-b12a-734fdd387cdf-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hzzjk\" (UID: \"02a87016-8f06-46ab-b12a-734fdd387cdf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzzjk" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.585564 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/02a87016-8f06-46ab-b12a-734fdd387cdf-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hzzjk\" (UID: \"02a87016-8f06-46ab-b12a-734fdd387cdf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzzjk" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.585619 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/02a87016-8f06-46ab-b12a-734fdd387cdf-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hzzjk\" (UID: \"02a87016-8f06-46ab-b12a-734fdd387cdf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzzjk" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.686274 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/02a87016-8f06-46ab-b12a-734fdd387cdf-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hzzjk\" (UID: \"02a87016-8f06-46ab-b12a-734fdd387cdf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzzjk" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.686333 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02a87016-8f06-46ab-b12a-734fdd387cdf-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hzzjk\" (UID: \"02a87016-8f06-46ab-b12a-734fdd387cdf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzzjk" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.686366 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02a87016-8f06-46ab-b12a-734fdd387cdf-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hzzjk\" (UID: \"02a87016-8f06-46ab-b12a-734fdd387cdf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzzjk" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.686417 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/02a87016-8f06-46ab-b12a-734fdd387cdf-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hzzjk\" (UID: \"02a87016-8f06-46ab-b12a-734fdd387cdf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzzjk" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.686461 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/02a87016-8f06-46ab-b12a-734fdd387cdf-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hzzjk\" (UID: \"02a87016-8f06-46ab-b12a-734fdd387cdf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzzjk" Mar 09 13:22:48 crc 
kubenswrapper[4703]: I0309 13:22:48.686481 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/02a87016-8f06-46ab-b12a-734fdd387cdf-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hzzjk\" (UID: \"02a87016-8f06-46ab-b12a-734fdd387cdf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzzjk" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.686589 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/02a87016-8f06-46ab-b12a-734fdd387cdf-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hzzjk\" (UID: \"02a87016-8f06-46ab-b12a-734fdd387cdf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzzjk" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.688266 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/02a87016-8f06-46ab-b12a-734fdd387cdf-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hzzjk\" (UID: \"02a87016-8f06-46ab-b12a-734fdd387cdf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzzjk" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.701045 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02a87016-8f06-46ab-b12a-734fdd387cdf-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hzzjk\" (UID: \"02a87016-8f06-46ab-b12a-734fdd387cdf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzzjk" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.706432 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:48 crc kubenswrapper[4703]: E0309 13:22:48.706649 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.707672 4703 scope.go:117] "RemoveContainer" containerID="eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7" Mar 09 13:22:48 crc kubenswrapper[4703]: E0309 13:22:48.707954 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-9khwq_openshift-ovn-kubernetes(650f98b2-73a9-4c73-b0cf-70d3bdd61edd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.716512 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02a87016-8f06-46ab-b12a-734fdd387cdf-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hzzjk\" (UID: \"02a87016-8f06-46ab-b12a-734fdd387cdf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzzjk" Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.768145 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzzjk" Mar 09 13:22:48 crc kubenswrapper[4703]: W0309 13:22:48.790753 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02a87016_8f06_46ab_b12a_734fdd387cdf.slice/crio-180c54306eb9106c3aafd3fcb7f40fc6ba087083b0b80412ab33187abb784d31 WatchSource:0}: Error finding container 180c54306eb9106c3aafd3fcb7f40fc6ba087083b0b80412ab33187abb784d31: Status 404 returned error can't find the container with id 180c54306eb9106c3aafd3fcb7f40fc6ba087083b0b80412ab33187abb784d31 Mar 09 13:22:48 crc kubenswrapper[4703]: I0309 13:22:48.921787 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzzjk" event={"ID":"02a87016-8f06-46ab-b12a-734fdd387cdf","Type":"ContainerStarted","Data":"180c54306eb9106c3aafd3fcb7f40fc6ba087083b0b80412ab33187abb784d31"} Mar 09 13:22:49 crc kubenswrapper[4703]: I0309 13:22:49.705902 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:49 crc kubenswrapper[4703]: I0309 13:22:49.705968 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:49 crc kubenswrapper[4703]: E0309 13:22:49.706051 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:49 crc kubenswrapper[4703]: I0309 13:22:49.705974 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:49 crc kubenswrapper[4703]: E0309 13:22:49.706132 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:49 crc kubenswrapper[4703]: E0309 13:22:49.706380 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:49 crc kubenswrapper[4703]: E0309 13:22:49.823994 4703 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:22:49 crc kubenswrapper[4703]: I0309 13:22:49.929627 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzzjk" event={"ID":"02a87016-8f06-46ab-b12a-734fdd387cdf","Type":"ContainerStarted","Data":"277ff0700b18d1f3fe34dcb58ff77082b9fdf96f20aaaaf107333733242b47e5"} Mar 09 13:22:50 crc kubenswrapper[4703]: I0309 13:22:50.706900 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:50 crc kubenswrapper[4703]: E0309 13:22:50.707571 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:51 crc kubenswrapper[4703]: I0309 13:22:51.706820 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:51 crc kubenswrapper[4703]: I0309 13:22:51.706954 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:51 crc kubenswrapper[4703]: I0309 13:22:51.706951 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:51 crc kubenswrapper[4703]: E0309 13:22:51.707113 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:51 crc kubenswrapper[4703]: E0309 13:22:51.707276 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:51 crc kubenswrapper[4703]: E0309 13:22:51.707358 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:52 crc kubenswrapper[4703]: I0309 13:22:52.706506 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:52 crc kubenswrapper[4703]: E0309 13:22:52.706947 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:53 crc kubenswrapper[4703]: I0309 13:22:53.705896 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:53 crc kubenswrapper[4703]: I0309 13:22:53.705961 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:53 crc kubenswrapper[4703]: I0309 13:22:53.705984 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:53 crc kubenswrapper[4703]: E0309 13:22:53.706074 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:53 crc kubenswrapper[4703]: E0309 13:22:53.706150 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:53 crc kubenswrapper[4703]: E0309 13:22:53.706307 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:54 crc kubenswrapper[4703]: I0309 13:22:54.706336 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:54 crc kubenswrapper[4703]: E0309 13:22:54.708460 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:54 crc kubenswrapper[4703]: E0309 13:22:54.824745 4703 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:22:55 crc kubenswrapper[4703]: I0309 13:22:55.706737 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:55 crc kubenswrapper[4703]: I0309 13:22:55.706737 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:55 crc kubenswrapper[4703]: E0309 13:22:55.706878 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:55 crc kubenswrapper[4703]: I0309 13:22:55.706946 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:55 crc kubenswrapper[4703]: E0309 13:22:55.707123 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:55 crc kubenswrapper[4703]: E0309 13:22:55.707245 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:56 crc kubenswrapper[4703]: E0309 13:22:56.680935 4703 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:56 crc kubenswrapper[4703]: E0309 13:22:56.681054 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs podName:967e7a44-ac71-42aa-9847-37799ff35cc0 nodeName:}" failed. No retries permitted until 2026-03-09 13:24:00.681021046 +0000 UTC m=+236.648436772 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs") pod "network-metrics-daemon-jlgk5" (UID: "967e7a44-ac71-42aa-9847-37799ff35cc0") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 13:22:56 crc kubenswrapper[4703]: I0309 13:22:56.681340 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs\") pod \"network-metrics-daemon-jlgk5\" (UID: \"967e7a44-ac71-42aa-9847-37799ff35cc0\") " pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:56 crc kubenswrapper[4703]: I0309 13:22:56.705902 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:56 crc kubenswrapper[4703]: E0309 13:22:56.706058 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:57 crc kubenswrapper[4703]: I0309 13:22:57.706080 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:57 crc kubenswrapper[4703]: I0309 13:22:57.706150 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:57 crc kubenswrapper[4703]: E0309 13:22:57.706269 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:57 crc kubenswrapper[4703]: I0309 13:22:57.706091 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:57 crc kubenswrapper[4703]: E0309 13:22:57.706499 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:57 crc kubenswrapper[4703]: E0309 13:22:57.706773 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:58 crc kubenswrapper[4703]: I0309 13:22:58.706484 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:22:58 crc kubenswrapper[4703]: E0309 13:22:58.706668 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:22:59 crc kubenswrapper[4703]: I0309 13:22:59.706390 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:22:59 crc kubenswrapper[4703]: I0309 13:22:59.706425 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:22:59 crc kubenswrapper[4703]: I0309 13:22:59.706520 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:22:59 crc kubenswrapper[4703]: E0309 13:22:59.706640 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:22:59 crc kubenswrapper[4703]: E0309 13:22:59.706826 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:22:59 crc kubenswrapper[4703]: E0309 13:22:59.707602 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:22:59 crc kubenswrapper[4703]: I0309 13:22:59.708022 4703 scope.go:117] "RemoveContainer" containerID="eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7" Mar 09 13:22:59 crc kubenswrapper[4703]: E0309 13:22:59.708252 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-9khwq_openshift-ovn-kubernetes(650f98b2-73a9-4c73-b0cf-70d3bdd61edd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" Mar 09 13:22:59 crc kubenswrapper[4703]: E0309 13:22:59.826708 4703 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:23:00 crc kubenswrapper[4703]: I0309 13:23:00.707077 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:00 crc kubenswrapper[4703]: E0309 13:23:00.707291 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:01 crc kubenswrapper[4703]: I0309 13:23:01.706464 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:23:01 crc kubenswrapper[4703]: I0309 13:23:01.706530 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:01 crc kubenswrapper[4703]: I0309 13:23:01.706526 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:01 crc kubenswrapper[4703]: E0309 13:23:01.706648 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:23:01 crc kubenswrapper[4703]: E0309 13:23:01.706876 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:01 crc kubenswrapper[4703]: E0309 13:23:01.707069 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:02 crc kubenswrapper[4703]: I0309 13:23:02.706381 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:02 crc kubenswrapper[4703]: E0309 13:23:02.706927 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:03 crc kubenswrapper[4703]: I0309 13:23:03.706688 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:03 crc kubenswrapper[4703]: I0309 13:23:03.706761 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:23:03 crc kubenswrapper[4703]: I0309 13:23:03.706765 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:03 crc kubenswrapper[4703]: E0309 13:23:03.706896 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:23:03 crc kubenswrapper[4703]: E0309 13:23:03.706953 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:03 crc kubenswrapper[4703]: E0309 13:23:03.707029 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:04 crc kubenswrapper[4703]: I0309 13:23:04.708428 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:04 crc kubenswrapper[4703]: E0309 13:23:04.708625 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:04 crc kubenswrapper[4703]: E0309 13:23:04.827945 4703 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:23:05 crc kubenswrapper[4703]: I0309 13:23:05.706429 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:05 crc kubenswrapper[4703]: I0309 13:23:05.706492 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:05 crc kubenswrapper[4703]: E0309 13:23:05.706948 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:05 crc kubenswrapper[4703]: I0309 13:23:05.707045 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:23:05 crc kubenswrapper[4703]: E0309 13:23:05.707152 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:05 crc kubenswrapper[4703]: E0309 13:23:05.707292 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:23:06 crc kubenswrapper[4703]: I0309 13:23:06.706865 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:06 crc kubenswrapper[4703]: E0309 13:23:06.707006 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:07 crc kubenswrapper[4703]: I0309 13:23:07.706382 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:07 crc kubenswrapper[4703]: I0309 13:23:07.706488 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:23:07 crc kubenswrapper[4703]: I0309 13:23:07.706401 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:07 crc kubenswrapper[4703]: E0309 13:23:07.706580 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:07 crc kubenswrapper[4703]: E0309 13:23:07.706765 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:23:07 crc kubenswrapper[4703]: E0309 13:23:07.706975 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:08 crc kubenswrapper[4703]: I0309 13:23:08.706782 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:08 crc kubenswrapper[4703]: E0309 13:23:08.707013 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:09 crc kubenswrapper[4703]: I0309 13:23:09.706255 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:09 crc kubenswrapper[4703]: I0309 13:23:09.706323 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:23:09 crc kubenswrapper[4703]: E0309 13:23:09.706449 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:09 crc kubenswrapper[4703]: I0309 13:23:09.706346 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:09 crc kubenswrapper[4703]: E0309 13:23:09.706588 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:23:09 crc kubenswrapper[4703]: E0309 13:23:09.706787 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:09 crc kubenswrapper[4703]: E0309 13:23:09.829159 4703 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:23:10 crc kubenswrapper[4703]: I0309 13:23:10.706327 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:10 crc kubenswrapper[4703]: E0309 13:23:10.706676 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:11 crc kubenswrapper[4703]: I0309 13:23:11.706603 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:23:11 crc kubenswrapper[4703]: E0309 13:23:11.706758 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:23:11 crc kubenswrapper[4703]: I0309 13:23:11.706612 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:11 crc kubenswrapper[4703]: I0309 13:23:11.706896 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:11 crc kubenswrapper[4703]: E0309 13:23:11.706996 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:11 crc kubenswrapper[4703]: E0309 13:23:11.707128 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:12 crc kubenswrapper[4703]: I0309 13:23:12.706892 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:12 crc kubenswrapper[4703]: E0309 13:23:12.707009 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:13 crc kubenswrapper[4703]: I0309 13:23:13.013479 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n9x5k_d59f2278-9dbc-48bb-8d56-fa9da4183118/kube-multus/1.log" Mar 09 13:23:13 crc kubenswrapper[4703]: I0309 13:23:13.014141 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n9x5k_d59f2278-9dbc-48bb-8d56-fa9da4183118/kube-multus/0.log" Mar 09 13:23:13 crc kubenswrapper[4703]: I0309 13:23:13.014222 4703 generic.go:334] "Generic (PLEG): container finished" podID="d59f2278-9dbc-48bb-8d56-fa9da4183118" containerID="d0ebf3127b5bf77a8c7241f6c6ee6e7507fcdf3cb9e424870b6c331bc5a40b3f" exitCode=1 Mar 09 13:23:13 crc kubenswrapper[4703]: I0309 13:23:13.014292 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n9x5k" event={"ID":"d59f2278-9dbc-48bb-8d56-fa9da4183118","Type":"ContainerDied","Data":"d0ebf3127b5bf77a8c7241f6c6ee6e7507fcdf3cb9e424870b6c331bc5a40b3f"} Mar 09 13:23:13 crc kubenswrapper[4703]: I0309 13:23:13.014359 4703 scope.go:117] "RemoveContainer" 
containerID="6822059f46bdcac7f0fe662aadd907b8b1434e0a4bf15e8549235a8271f65526" Mar 09 13:23:13 crc kubenswrapper[4703]: I0309 13:23:13.014886 4703 scope.go:117] "RemoveContainer" containerID="d0ebf3127b5bf77a8c7241f6c6ee6e7507fcdf3cb9e424870b6c331bc5a40b3f" Mar 09 13:23:13 crc kubenswrapper[4703]: E0309 13:23:13.015118 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-n9x5k_openshift-multus(d59f2278-9dbc-48bb-8d56-fa9da4183118)\"" pod="openshift-multus/multus-n9x5k" podUID="d59f2278-9dbc-48bb-8d56-fa9da4183118" Mar 09 13:23:13 crc kubenswrapper[4703]: I0309 13:23:13.054950 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzzjk" podStartSLOduration=137.054926934 podStartE2EDuration="2m17.054926934s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:22:49.952983548 +0000 UTC m=+165.920399274" watchObservedRunningTime="2026-03-09 13:23:13.054926934 +0000 UTC m=+189.022342660" Mar 09 13:23:13 crc kubenswrapper[4703]: I0309 13:23:13.706445 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:13 crc kubenswrapper[4703]: I0309 13:23:13.706482 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:13 crc kubenswrapper[4703]: I0309 13:23:13.706558 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:23:13 crc kubenswrapper[4703]: E0309 13:23:13.706821 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:13 crc kubenswrapper[4703]: E0309 13:23:13.706996 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:13 crc kubenswrapper[4703]: E0309 13:23:13.707154 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:23:13 crc kubenswrapper[4703]: I0309 13:23:13.708399 4703 scope.go:117] "RemoveContainer" containerID="eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7" Mar 09 13:23:14 crc kubenswrapper[4703]: I0309 13:23:14.021126 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9khwq_650f98b2-73a9-4c73-b0cf-70d3bdd61edd/ovnkube-controller/3.log" Mar 09 13:23:14 crc kubenswrapper[4703]: I0309 13:23:14.024788 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerStarted","Data":"0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78"} Mar 09 13:23:14 crc kubenswrapper[4703]: I0309 13:23:14.025409 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:23:14 crc kubenswrapper[4703]: I0309 13:23:14.026775 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n9x5k_d59f2278-9dbc-48bb-8d56-fa9da4183118/kube-multus/1.log" Mar 09 13:23:14 crc kubenswrapper[4703]: I0309 13:23:14.061125 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" podStartSLOduration=137.061090675 podStartE2EDuration="2m17.061090675s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:14.058895561 +0000 UTC m=+190.026311267" watchObservedRunningTime="2026-03-09 13:23:14.061090675 +0000 UTC m=+190.028506401" Mar 09 13:23:14 crc kubenswrapper[4703]: I0309 13:23:14.485621 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jlgk5"] Mar 09 13:23:14 crc 
kubenswrapper[4703]: I0309 13:23:14.485748 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:23:14 crc kubenswrapper[4703]: E0309 13:23:14.485864 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:23:14 crc kubenswrapper[4703]: I0309 13:23:14.707209 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:14 crc kubenswrapper[4703]: E0309 13:23:14.707296 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:14 crc kubenswrapper[4703]: E0309 13:23:14.829603 4703 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:23:15 crc kubenswrapper[4703]: I0309 13:23:15.706344 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:15 crc kubenswrapper[4703]: I0309 13:23:15.706419 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:15 crc kubenswrapper[4703]: E0309 13:23:15.706509 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:15 crc kubenswrapper[4703]: E0309 13:23:15.706618 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:16 crc kubenswrapper[4703]: I0309 13:23:16.706724 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:23:16 crc kubenswrapper[4703]: E0309 13:23:16.706943 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:23:16 crc kubenswrapper[4703]: I0309 13:23:16.707312 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:16 crc kubenswrapper[4703]: E0309 13:23:16.707703 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:17 crc kubenswrapper[4703]: I0309 13:23:17.706346 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:17 crc kubenswrapper[4703]: E0309 13:23:17.706509 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:17 crc kubenswrapper[4703]: I0309 13:23:17.706346 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:17 crc kubenswrapper[4703]: E0309 13:23:17.706821 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:18 crc kubenswrapper[4703]: I0309 13:23:18.706577 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:23:18 crc kubenswrapper[4703]: I0309 13:23:18.706678 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:18 crc kubenswrapper[4703]: E0309 13:23:18.706731 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:23:18 crc kubenswrapper[4703]: E0309 13:23:18.706919 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:19 crc kubenswrapper[4703]: I0309 13:23:19.706577 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:19 crc kubenswrapper[4703]: I0309 13:23:19.706597 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:19 crc kubenswrapper[4703]: E0309 13:23:19.706918 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:19 crc kubenswrapper[4703]: E0309 13:23:19.707040 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:19 crc kubenswrapper[4703]: E0309 13:23:19.830908 4703 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 13:23:20 crc kubenswrapper[4703]: I0309 13:23:20.706387 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:23:20 crc kubenswrapper[4703]: I0309 13:23:20.706390 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:20 crc kubenswrapper[4703]: E0309 13:23:20.706747 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:20 crc kubenswrapper[4703]: E0309 13:23:20.706608 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:23:21 crc kubenswrapper[4703]: I0309 13:23:21.706329 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:21 crc kubenswrapper[4703]: I0309 13:23:21.706329 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:21 crc kubenswrapper[4703]: E0309 13:23:21.706561 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:21 crc kubenswrapper[4703]: E0309 13:23:21.706674 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:22 crc kubenswrapper[4703]: I0309 13:23:22.706885 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:23:22 crc kubenswrapper[4703]: I0309 13:23:22.707005 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:22 crc kubenswrapper[4703]: E0309 13:23:22.707098 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:23:22 crc kubenswrapper[4703]: E0309 13:23:22.707190 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:23 crc kubenswrapper[4703]: I0309 13:23:23.706916 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:23 crc kubenswrapper[4703]: E0309 13:23:23.707163 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:23 crc kubenswrapper[4703]: I0309 13:23:23.707295 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:23 crc kubenswrapper[4703]: E0309 13:23:23.707465 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:24 crc kubenswrapper[4703]: I0309 13:23:24.707012 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:23:24 crc kubenswrapper[4703]: I0309 13:23:24.707009 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:24 crc kubenswrapper[4703]: I0309 13:23:24.708462 4703 scope.go:117] "RemoveContainer" containerID="d0ebf3127b5bf77a8c7241f6c6ee6e7507fcdf3cb9e424870b6c331bc5a40b3f" Mar 09 13:23:24 crc kubenswrapper[4703]: E0309 13:23:24.708639 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:24 crc kubenswrapper[4703]: E0309 13:23:24.708731 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:23:24 crc kubenswrapper[4703]: E0309 13:23:24.831668 4703 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 13:23:25 crc kubenswrapper[4703]: I0309 13:23:25.067062 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n9x5k_d59f2278-9dbc-48bb-8d56-fa9da4183118/kube-multus/1.log" Mar 09 13:23:25 crc kubenswrapper[4703]: I0309 13:23:25.067126 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n9x5k" event={"ID":"d59f2278-9dbc-48bb-8d56-fa9da4183118","Type":"ContainerStarted","Data":"668964c01b4d2908da9803589c5ad4a6e403b5be0fd29ed46afa2fba2d4dd26c"} Mar 09 13:23:25 crc kubenswrapper[4703]: I0309 13:23:25.706174 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:25 crc kubenswrapper[4703]: E0309 13:23:25.706291 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:25 crc kubenswrapper[4703]: I0309 13:23:25.706349 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:25 crc kubenswrapper[4703]: E0309 13:23:25.706547 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:26 crc kubenswrapper[4703]: I0309 13:23:26.706313 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:23:26 crc kubenswrapper[4703]: E0309 13:23:26.706550 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:23:26 crc kubenswrapper[4703]: I0309 13:23:26.706932 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:26 crc kubenswrapper[4703]: E0309 13:23:26.707064 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:27 crc kubenswrapper[4703]: I0309 13:23:27.706795 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:27 crc kubenswrapper[4703]: I0309 13:23:27.706814 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:27 crc kubenswrapper[4703]: E0309 13:23:27.708217 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:27 crc kubenswrapper[4703]: E0309 13:23:27.708330 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:28 crc kubenswrapper[4703]: I0309 13:23:28.707229 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:23:28 crc kubenswrapper[4703]: E0309 13:23:28.707418 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jlgk5" podUID="967e7a44-ac71-42aa-9847-37799ff35cc0" Mar 09 13:23:28 crc kubenswrapper[4703]: I0309 13:23:28.707239 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:28 crc kubenswrapper[4703]: E0309 13:23:28.707785 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 13:23:29 crc kubenswrapper[4703]: I0309 13:23:29.706579 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:29 crc kubenswrapper[4703]: I0309 13:23:29.706625 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:29 crc kubenswrapper[4703]: E0309 13:23:29.706719 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 13:23:29 crc kubenswrapper[4703]: E0309 13:23:29.706928 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 13:23:30 crc kubenswrapper[4703]: I0309 13:23:30.706121 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:23:30 crc kubenswrapper[4703]: I0309 13:23:30.706213 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:30 crc kubenswrapper[4703]: I0309 13:23:30.709186 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 09 13:23:30 crc kubenswrapper[4703]: I0309 13:23:30.709999 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 09 13:23:30 crc kubenswrapper[4703]: I0309 13:23:30.710634 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 09 13:23:30 crc kubenswrapper[4703]: I0309 13:23:30.711089 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 09 13:23:31 crc kubenswrapper[4703]: I0309 13:23:31.705946 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:31 crc kubenswrapper[4703]: I0309 13:23:31.705985 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:31 crc kubenswrapper[4703]: I0309 13:23:31.708137 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 09 13:23:31 crc kubenswrapper[4703]: I0309 13:23:31.708892 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.500444 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.500528 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.549078 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.579320 4703 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.619224 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vh56s"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.619658 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vh56s" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.620721 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-tnn9w"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.621187 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.622307 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6qv87"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.622738 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-6qv87" Mar 09 13:23:39 crc kubenswrapper[4703]: W0309 13:23:39.623413 4703 reflector.go:561] object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv": failed to list *v1.Secret: secrets "openshift-apiserver-operator-dockercfg-xtcjv" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver-operator": no relationship found between node 'crc' and this object Mar 09 13:23:39 crc kubenswrapper[4703]: E0309 13:23:39.623453 4703 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-dockercfg-xtcjv\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-apiserver-operator-dockercfg-xtcjv\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 13:23:39 crc kubenswrapper[4703]: W0309 13:23:39.623593 4703 reflector.go:561] 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert": failed to list *v1.Secret: secrets "openshift-apiserver-operator-serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver-operator": no relationship found between node 'crc' and this object Mar 09 13:23:39 crc kubenswrapper[4703]: E0309 13:23:39.623620 4703 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-apiserver-operator-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 13:23:39 crc kubenswrapper[4703]: W0309 13:23:39.623744 4703 reflector.go:561] object-"openshift-apiserver-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver-operator": no relationship found between node 'crc' and this object Mar 09 13:23:39 crc kubenswrapper[4703]: E0309 13:23:39.623771 4703 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 13:23:39 crc kubenswrapper[4703]: W0309 13:23:39.624633 4703 reflector.go:561] object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj": failed to list *v1.Secret: secrets 
"authentication-operator-dockercfg-mz9bj" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Mar 09 13:23:39 crc kubenswrapper[4703]: E0309 13:23:39.624665 4703 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"authentication-operator-dockercfg-mz9bj\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"authentication-operator-dockercfg-mz9bj\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 13:23:39 crc kubenswrapper[4703]: W0309 13:23:39.624866 4703 reflector.go:561] object-"openshift-authentication-operator"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.624895 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: E0309 13:23:39.624910 4703 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 13:23:39 crc kubenswrapper[4703]: W0309 13:23:39.624932 4703 reflector.go:561] object-"openshift-authentication-operator"/"trusted-ca-bundle": failed to list 
*v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Mar 09 13:23:39 crc kubenswrapper[4703]: E0309 13:23:39.624944 4703 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 13:23:39 crc kubenswrapper[4703]: W0309 13:23:39.625099 4703 reflector.go:561] object-"openshift-authentication-operator"/"service-ca-bundle": failed to list *v1.ConfigMap: configmaps "service-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Mar 09 13:23:39 crc kubenswrapper[4703]: E0309 13:23:39.625121 4703 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"service-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"service-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 13:23:39 crc kubenswrapper[4703]: W0309 13:23:39.625169 4703 reflector.go:561] object-"openshift-authentication-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace 
"openshift-authentication-operator": no relationship found between node 'crc' and this object Mar 09 13:23:39 crc kubenswrapper[4703]: E0309 13:23:39.625184 4703 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.625556 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/955963d3-5f3c-46c6-bfa0-6473a1238064-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vh56s\" (UID: \"955963d3-5f3c-46c6-bfa0-6473a1238064\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vh56s" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.625660 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f94d\" (UniqueName: \"kubernetes.io/projected/955963d3-5f3c-46c6-bfa0-6473a1238064-kube-api-access-9f94d\") pod \"openshift-apiserver-operator-796bbdcf4f-vh56s\" (UID: \"955963d3-5f3c-46c6-bfa0-6473a1238064\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vh56s" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.625750 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/955963d3-5f3c-46c6-bfa0-6473a1238064-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vh56s\" (UID: \"955963d3-5f3c-46c6-bfa0-6473a1238064\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vh56s" Mar 09 13:23:39 crc 
kubenswrapper[4703]: W0309 13:23:39.626587 4703 reflector.go:561] object-"openshift-machine-api"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 09 13:23:39 crc kubenswrapper[4703]: E0309 13:23:39.626619 4703 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 13:23:39 crc kubenswrapper[4703]: W0309 13:23:39.626806 4703 reflector.go:561] object-"openshift-machine-api"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 09 13:23:39 crc kubenswrapper[4703]: E0309 13:23:39.626834 4703 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 13:23:39 crc kubenswrapper[4703]: W0309 13:23:39.626913 4703 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-tls": failed to list *v1.Secret: secrets "machine-api-operator-tls" is forbidden: User "system:node:crc" cannot list resource 
"secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 09 13:23:39 crc kubenswrapper[4703]: E0309 13:23:39.626930 4703 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.627015 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 09 13:23:39 crc kubenswrapper[4703]: W0309 13:23:39.627232 4703 reflector.go:561] object-"openshift-authentication-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Mar 09 13:23:39 crc kubenswrapper[4703]: E0309 13:23:39.627258 4703 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 13:23:39 crc kubenswrapper[4703]: W0309 13:23:39.629983 4703 reflector.go:561] object-"openshift-authentication-operator"/"authentication-operator-config": failed to list *v1.ConfigMap: configmaps "authentication-operator-config" is forbidden: 
User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Mar 09 13:23:39 crc kubenswrapper[4703]: E0309 13:23:39.630025 4703 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"authentication-operator-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"authentication-operator-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 13:23:39 crc kubenswrapper[4703]: W0309 13:23:39.630097 4703 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-images": failed to list *v1.ConfigMap: configmaps "machine-api-operator-images" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 09 13:23:39 crc kubenswrapper[4703]: E0309 13:23:39.630112 4703 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-images\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"machine-api-operator-images\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 13:23:39 crc kubenswrapper[4703]: W0309 13:23:39.630206 4703 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7": failed to list *v1.Secret: secrets "machine-api-operator-dockercfg-mfbb7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found 
between node 'crc' and this object Mar 09 13:23:39 crc kubenswrapper[4703]: E0309 13:23:39.630226 4703 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-mfbb7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-dockercfg-mfbb7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 13:23:39 crc kubenswrapper[4703]: W0309 13:23:39.630276 4703 reflector.go:561] object-"openshift-machine-api"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 09 13:23:39 crc kubenswrapper[4703]: E0309 13:23:39.630289 4703 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.631174 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-dhqcw"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.631661 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhqcw" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.636369 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.636579 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.640445 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-k8tgg"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.640981 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.641278 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9qw5t"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.641596 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-9qw5t" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.642353 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.642667 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.645183 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.645732 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.646331 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ghbk9"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.646839 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.647694 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-vs8rs"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.648436 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-vs8rs" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.648839 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fjq8h"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.649236 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.653765 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.654000 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.654049 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.654476 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.654597 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qd4mj"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.655145 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qd4mj" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.655545 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.655758 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.656171 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-n7xr4"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.656559 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.658394 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.658632 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.658777 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.659006 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.659144 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.659412 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.659639 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.659835 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.659869 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xzsw"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.660065 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.660212 4703 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.660377 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.669231 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.670055 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.670337 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.671105 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-jbdq8"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.673719 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xzsw" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.674528 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.679333 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.679366 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.679352 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.679583 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.679735 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.679819 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.679876 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.683510 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5clrh"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.684350 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication-operator/authentication-operator-69f744f599-tnn9w"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.684473 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6qv87"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.684551 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kn4xb"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.683616 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.680014 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.680072 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.680236 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.680410 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.680438 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.680472 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.680621 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 09 
13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.680670 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.680681 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.680726 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.680891 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.681095 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.681452 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.681510 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.681544 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.685667 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kn4xb" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.685763 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5clrh" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.688000 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.688116 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.688227 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.688401 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.688674 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.688900 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.688986 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.688989 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.688997 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.689031 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.689043 4703 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.690861 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vh56s"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.689056 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.689109 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.689142 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.689162 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.689181 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.689223 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.689263 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.689310 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.689334 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 13:23:39 
crc kubenswrapper[4703]: I0309 13:23:39.689403 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.689641 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.689672 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.689722 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.689880 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.690032 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.694042 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.694195 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.694244 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.696297 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-nzj56"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.695082 4703 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.699483 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.699718 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.700696 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-nzj56" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.700789 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.701142 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.701327 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gnllw"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.701930 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gnllw" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.702495 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.702690 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lddp9"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.703285 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.705050 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vsdf8"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.705815 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdf8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.706083 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-88z8c"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.706794 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-88z8c" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.709435 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.713705 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.714672 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.715998 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7q8lx"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.730570 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7q8lx" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.731596 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5acad2e3-ce55-43d7-b83f-75adb6b59e71-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tnn9w\" (UID: \"5acad2e3-ce55-43d7-b83f-75adb6b59e71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.732650 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/955963d3-5f3c-46c6-bfa0-6473a1238064-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vh56s\" (UID: \"955963d3-5f3c-46c6-bfa0-6473a1238064\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vh56s" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.732872 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvmv6"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.733628 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/955963d3-5f3c-46c6-bfa0-6473a1238064-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vh56s\" (UID: \"955963d3-5f3c-46c6-bfa0-6473a1238064\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vh56s" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.733763 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.733781 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7e8298d3-3e49-4df3-9369-2623b11981cf-config\") pod \"machine-api-operator-5694c8668f-6qv87\" (UID: \"7e8298d3-3e49-4df3-9369-2623b11981cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6qv87" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.737089 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e8298d3-3e49-4df3-9369-2623b11981cf-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6qv87\" (UID: \"7e8298d3-3e49-4df3-9369-2623b11981cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6qv87" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.737271 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fh9g\" (UniqueName: \"kubernetes.io/projected/7e8298d3-3e49-4df3-9369-2623b11981cf-kube-api-access-4fh9g\") pod \"machine-api-operator-5694c8668f-6qv87\" (UID: \"7e8298d3-3e49-4df3-9369-2623b11981cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6qv87" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.737397 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29wlf\" (UniqueName: \"kubernetes.io/projected/5acad2e3-ce55-43d7-b83f-75adb6b59e71-kube-api-access-29wlf\") pod \"authentication-operator-69f744f599-tnn9w\" (UID: \"5acad2e3-ce55-43d7-b83f-75adb6b59e71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.737484 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5acad2e3-ce55-43d7-b83f-75adb6b59e71-serving-cert\") pod \"authentication-operator-69f744f599-tnn9w\" (UID: \"5acad2e3-ce55-43d7-b83f-75adb6b59e71\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.737565 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5acad2e3-ce55-43d7-b83f-75adb6b59e71-config\") pod \"authentication-operator-69f744f599-tnn9w\" (UID: \"5acad2e3-ce55-43d7-b83f-75adb6b59e71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.737645 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5acad2e3-ce55-43d7-b83f-75adb6b59e71-service-ca-bundle\") pod \"authentication-operator-69f744f599-tnn9w\" (UID: \"5acad2e3-ce55-43d7-b83f-75adb6b59e71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.737730 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/955963d3-5f3c-46c6-bfa0-6473a1238064-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vh56s\" (UID: \"955963d3-5f3c-46c6-bfa0-6473a1238064\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vh56s" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.738054 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f94d\" (UniqueName: \"kubernetes.io/projected/955963d3-5f3c-46c6-bfa0-6473a1238064-kube-api-access-9f94d\") pod \"openshift-apiserver-operator-796bbdcf4f-vh56s\" (UID: \"955963d3-5f3c-46c6-bfa0-6473a1238064\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vh56s" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.738201 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7e8298d3-3e49-4df3-9369-2623b11981cf-images\") pod \"machine-api-operator-5694c8668f-6qv87\" (UID: \"7e8298d3-3e49-4df3-9369-2623b11981cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6qv87" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.765731 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvmv6" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.766330 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.768600 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6fmxt"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.769285 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6fmxt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.771709 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gg2ch"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.772913 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kzdxt"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.773345 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kzdxt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.773559 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gg2ch" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.773911 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.777905 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ntntn"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.778703 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ntntn" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.778930 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hzcrb"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.779511 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hzcrb" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.780514 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g6rg5"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.781103 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g6rg5" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.781672 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.781901 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-cmpgs"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.783697 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jcpg"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.784319 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jcpg" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.784635 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cmpgs" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.784947 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xzsw"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.785786 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551035-4mjnk"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.786565 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-4mjnk" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.787007 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knp8m"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.787795 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knp8m" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.788179 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wzpmb"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.788599 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.788747 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-wzpmb" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.789699 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wzz7z"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.790357 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wzz7z" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.791752 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ptzrr"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.792495 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.793315 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.797547 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551042-fv4ms"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.798347 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.802622 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.804348 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pjlkd"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.805226 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-vs8rs"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.805243 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-k8tgg"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.805253 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qd4mj"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.805261 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9qw5t"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.805271 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-cmpgs"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 
13:23:39.805320 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551042-fv4ms" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.805354 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.805753 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-jbdq8"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.806960 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5clrh"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.808254 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551042-fv4ms"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.809337 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ghbk9"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.810452 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvmv6"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.811652 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.824189 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-n7xr4"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.825736 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hzcrb"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.826978 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6fmxt"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.829219 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.838167 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7q8lx"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839195 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839242 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839261 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/584ca99c-8678-42f1-8a73-704780debc36-proxy-tls\") pod \"machine-config-controller-84d6567774-7q8lx\" (UID: \"584ca99c-8678-42f1-8a73-704780debc36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7q8lx" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839284 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9e0f844-c3b0-4411-ba1a-4c57a5e7796e-serving-cert\") pod \"console-operator-58897d9998-9qw5t\" (UID: \"f9e0f844-c3b0-4411-ba1a-4c57a5e7796e\") " pod="openshift-console-operator/console-operator-58897d9998-9qw5t" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839301 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9984b825-d6a3-4756-b2dc-2a240ca82a8d-etcd-ca\") pod \"etcd-operator-b45778765-n7xr4\" (UID: \"9984b825-d6a3-4756-b2dc-2a240ca82a8d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839323 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04b27022-1294-45a5-90c4-17d007e9b468-serving-cert\") pod \"route-controller-manager-6576b87f9c-tvbdd\" (UID: \"04b27022-1294-45a5-90c4-17d007e9b468\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839338 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wft66\" (UniqueName: \"kubernetes.io/projected/04b27022-1294-45a5-90c4-17d007e9b468-kube-api-access-wft66\") pod \"route-controller-manager-6576b87f9c-tvbdd\" (UID: \"04b27022-1294-45a5-90c4-17d007e9b468\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839357 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839378 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-audit-policies\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839392 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e9d54014-8ba5-4a3a-beb0-3b260f0ef64c-auth-proxy-config\") pod \"machine-approver-56656f9798-dhqcw\" (UID: \"e9d54014-8ba5-4a3a-beb0-3b260f0ef64c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhqcw" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839412 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e9d54014-8ba5-4a3a-beb0-3b260f0ef64c-machine-approver-tls\") pod \"machine-approver-56656f9798-dhqcw\" (UID: \"e9d54014-8ba5-4a3a-beb0-3b260f0ef64c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhqcw" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839426 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839443 4703 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq6fd\" (UniqueName: \"kubernetes.io/projected/8748922b-1caa-42f6-a573-9a43c160b26a-kube-api-access-vq6fd\") pod \"openshift-controller-manager-operator-756b6f6bc6-qd4mj\" (UID: \"8748922b-1caa-42f6-a573-9a43c160b26a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qd4mj" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839457 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839476 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e8298d3-3e49-4df3-9369-2623b11981cf-config\") pod \"machine-api-operator-5694c8668f-6qv87\" (UID: \"7e8298d3-3e49-4df3-9369-2623b11981cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6qv87" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839494 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/98d3f0fa-d5c6-4288-af8a-bdc0b29dab63-oauth-serving-cert\") pod \"console-f9d7485db-jbdq8\" (UID: \"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63\") " pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839510 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e8298d3-3e49-4df3-9369-2623b11981cf-machine-api-operator-tls\") pod 
\"machine-api-operator-5694c8668f-6qv87\" (UID: \"7e8298d3-3e49-4df3-9369-2623b11981cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6qv87" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839526 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fh9g\" (UniqueName: \"kubernetes.io/projected/7e8298d3-3e49-4df3-9369-2623b11981cf-kube-api-access-4fh9g\") pod \"machine-api-operator-5694c8668f-6qv87\" (UID: \"7e8298d3-3e49-4df3-9369-2623b11981cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6qv87" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839544 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/584ca99c-8678-42f1-8a73-704780debc36-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7q8lx\" (UID: \"584ca99c-8678-42f1-8a73-704780debc36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7q8lx" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839560 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/98d3f0fa-d5c6-4288-af8a-bdc0b29dab63-console-oauth-config\") pod \"console-f9d7485db-jbdq8\" (UID: \"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63\") " pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839580 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4228r\" (UniqueName: \"kubernetes.io/projected/26e4ae04-1b0c-4cdf-a086-356bab16766e-kube-api-access-4228r\") pod \"cluster-samples-operator-665b6dd947-4xzsw\" (UID: \"26e4ae04-1b0c-4cdf-a086-356bab16766e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xzsw" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 
13:23:39.839594 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9984b825-d6a3-4756-b2dc-2a240ca82a8d-etcd-client\") pod \"etcd-operator-b45778765-n7xr4\" (UID: \"9984b825-d6a3-4756-b2dc-2a240ca82a8d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839609 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98d3f0fa-d5c6-4288-af8a-bdc0b29dab63-trusted-ca-bundle\") pod \"console-f9d7485db-jbdq8\" (UID: \"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63\") " pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839623 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml2tc\" (UniqueName: \"kubernetes.io/projected/98d3f0fa-d5c6-4288-af8a-bdc0b29dab63-kube-api-access-ml2tc\") pod \"console-f9d7485db-jbdq8\" (UID: \"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63\") " pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839641 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5acad2e3-ce55-43d7-b83f-75adb6b59e71-serving-cert\") pod \"authentication-operator-69f744f599-tnn9w\" (UID: \"5acad2e3-ce55-43d7-b83f-75adb6b59e71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839656 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29wlf\" (UniqueName: \"kubernetes.io/projected/5acad2e3-ce55-43d7-b83f-75adb6b59e71-kube-api-access-29wlf\") pod \"authentication-operator-69f744f599-tnn9w\" (UID: \"5acad2e3-ce55-43d7-b83f-75adb6b59e71\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839673 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04b27022-1294-45a5-90c4-17d007e9b468-config\") pod \"route-controller-manager-6576b87f9c-tvbdd\" (UID: \"04b27022-1294-45a5-90c4-17d007e9b468\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839688 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839705 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9e0f844-c3b0-4411-ba1a-4c57a5e7796e-config\") pod \"console-operator-58897d9998-9qw5t\" (UID: \"f9e0f844-c3b0-4411-ba1a-4c57a5e7796e\") " pod="openshift-console-operator/console-operator-58897d9998-9qw5t" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839721 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9e0f844-c3b0-4411-ba1a-4c57a5e7796e-trusted-ca\") pod \"console-operator-58897d9998-9qw5t\" (UID: \"f9e0f844-c3b0-4411-ba1a-4c57a5e7796e\") " pod="openshift-console-operator/console-operator-58897d9998-9qw5t" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.839771 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/8748922b-1caa-42f6-a573-9a43c160b26a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qd4mj\" (UID: \"8748922b-1caa-42f6-a573-9a43c160b26a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qd4mj" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840039 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9d54014-8ba5-4a3a-beb0-3b260f0ef64c-config\") pod \"machine-approver-56656f9798-dhqcw\" (UID: \"e9d54014-8ba5-4a3a-beb0-3b260f0ef64c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhqcw" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840092 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5acad2e3-ce55-43d7-b83f-75adb6b59e71-config\") pod \"authentication-operator-69f744f599-tnn9w\" (UID: \"5acad2e3-ce55-43d7-b83f-75adb6b59e71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840128 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czggb\" (UniqueName: \"kubernetes.io/projected/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-kube-api-access-czggb\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840153 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/98d3f0fa-d5c6-4288-af8a-bdc0b29dab63-service-ca\") pod \"console-f9d7485db-jbdq8\" (UID: \"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63\") " 
pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840226 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9984b825-d6a3-4756-b2dc-2a240ca82a8d-etcd-service-ca\") pod \"etcd-operator-b45778765-n7xr4\" (UID: \"9984b825-d6a3-4756-b2dc-2a240ca82a8d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840259 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/98d3f0fa-d5c6-4288-af8a-bdc0b29dab63-console-config\") pod \"console-f9d7485db-jbdq8\" (UID: \"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63\") " pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840315 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5acad2e3-ce55-43d7-b83f-75adb6b59e71-service-ca-bundle\") pod \"authentication-operator-69f744f599-tnn9w\" (UID: \"5acad2e3-ce55-43d7-b83f-75adb6b59e71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840356 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26e4ae04-1b0c-4cdf-a086-356bab16766e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4xzsw\" (UID: \"26e4ae04-1b0c-4cdf-a086-356bab16766e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xzsw" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840426 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9984b825-d6a3-4756-b2dc-2a240ca82a8d-config\") pod \"etcd-operator-b45778765-n7xr4\" (UID: \"9984b825-d6a3-4756-b2dc-2a240ca82a8d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840452 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-audit-dir\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840475 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/98d3f0fa-d5c6-4288-af8a-bdc0b29dab63-console-serving-cert\") pod \"console-f9d7485db-jbdq8\" (UID: \"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63\") " pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840456 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fjq8h"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840526 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsdvc\" (UniqueName: \"kubernetes.io/projected/f9e0f844-c3b0-4411-ba1a-4c57a5e7796e-kube-api-access-gsdvc\") pod \"console-operator-58897d9998-9qw5t\" (UID: \"f9e0f844-c3b0-4411-ba1a-4c57a5e7796e\") " pod="openshift-console-operator/console-operator-58897d9998-9qw5t" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840547 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zhq4\" (UniqueName: \"kubernetes.io/projected/e9d54014-8ba5-4a3a-beb0-3b260f0ef64c-kube-api-access-9zhq4\") pod 
\"machine-approver-56656f9798-dhqcw\" (UID: \"e9d54014-8ba5-4a3a-beb0-3b260f0ef64c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhqcw" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840577 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04b27022-1294-45a5-90c4-17d007e9b468-client-ca\") pod \"route-controller-manager-6576b87f9c-tvbdd\" (UID: \"04b27022-1294-45a5-90c4-17d007e9b468\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840598 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840668 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840700 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840729 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840790 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9984b825-d6a3-4756-b2dc-2a240ca82a8d-serving-cert\") pod \"etcd-operator-b45778765-n7xr4\" (UID: \"9984b825-d6a3-4756-b2dc-2a240ca82a8d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840824 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62prj\" (UniqueName: \"kubernetes.io/projected/584ca99c-8678-42f1-8a73-704780debc36-kube-api-access-62prj\") pod \"machine-config-controller-84d6567774-7q8lx\" (UID: \"584ca99c-8678-42f1-8a73-704780debc36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7q8lx" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840888 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7e8298d3-3e49-4df3-9369-2623b11981cf-images\") pod \"machine-api-operator-5694c8668f-6qv87\" (UID: \"7e8298d3-3e49-4df3-9369-2623b11981cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6qv87" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840918 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/8748922b-1caa-42f6-a573-9a43c160b26a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qd4mj\" (UID: \"8748922b-1caa-42f6-a573-9a43c160b26a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qd4mj" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840949 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqd4t\" (UniqueName: \"kubernetes.io/projected/9984b825-d6a3-4756-b2dc-2a240ca82a8d-kube-api-access-xqd4t\") pod \"etcd-operator-b45778765-n7xr4\" (UID: \"9984b825-d6a3-4756-b2dc-2a240ca82a8d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.840972 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.841000 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5acad2e3-ce55-43d7-b83f-75adb6b59e71-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tnn9w\" (UID: \"5acad2e3-ce55-43d7-b83f-75adb6b59e71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.853809 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vsdf8"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.856389 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gnllw"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.858695 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.862587 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.869791 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wzpmb"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.869912 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kn4xb"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.869926 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ntntn"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.872096 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jcpg"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.873009 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ptzrr"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.874520 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gg2ch"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.876184 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g6rg5"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.877639 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.879375 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vkt4s"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.880079 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vkt4s" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.881169 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kzdxt"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.881672 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.882428 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-trj95"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.883179 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-trj95" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.884157 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-88z8c"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.885514 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551035-4mjnk"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.887599 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lddp9"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.889142 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pjlkd"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.890550 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-trj95"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.892228 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vkt4s"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.893657 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wzz7z"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.895258 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knp8m"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.896647 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.898456 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-knk5k"] Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.899085 4703 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-knk5k" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.901663 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.922140 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.942377 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.942493 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/98d3f0fa-d5c6-4288-af8a-bdc0b29dab63-oauth-serving-cert\") pod \"console-f9d7485db-jbdq8\" (UID: \"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63\") " pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.942546 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/584ca99c-8678-42f1-8a73-704780debc36-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7q8lx\" (UID: \"584ca99c-8678-42f1-8a73-704780debc36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7q8lx" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.942571 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/98d3f0fa-d5c6-4288-af8a-bdc0b29dab63-console-oauth-config\") pod \"console-f9d7485db-jbdq8\" (UID: \"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63\") " pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.942596 4703 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4228r\" (UniqueName: \"kubernetes.io/projected/26e4ae04-1b0c-4cdf-a086-356bab16766e-kube-api-access-4228r\") pod \"cluster-samples-operator-665b6dd947-4xzsw\" (UID: \"26e4ae04-1b0c-4cdf-a086-356bab16766e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xzsw" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.942617 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9984b825-d6a3-4756-b2dc-2a240ca82a8d-etcd-client\") pod \"etcd-operator-b45778765-n7xr4\" (UID: \"9984b825-d6a3-4756-b2dc-2a240ca82a8d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.942640 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98d3f0fa-d5c6-4288-af8a-bdc0b29dab63-trusted-ca-bundle\") pod \"console-f9d7485db-jbdq8\" (UID: \"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63\") " pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.942659 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml2tc\" (UniqueName: \"kubernetes.io/projected/98d3f0fa-d5c6-4288-af8a-bdc0b29dab63-kube-api-access-ml2tc\") pod \"console-f9d7485db-jbdq8\" (UID: \"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63\") " pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.942700 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04b27022-1294-45a5-90c4-17d007e9b468-config\") pod \"route-controller-manager-6576b87f9c-tvbdd\" (UID: \"04b27022-1294-45a5-90c4-17d007e9b468\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" Mar 09 13:23:39 crc 
kubenswrapper[4703]: I0309 13:23:39.942721 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.942744 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9e0f844-c3b0-4411-ba1a-4c57a5e7796e-config\") pod \"console-operator-58897d9998-9qw5t\" (UID: \"f9e0f844-c3b0-4411-ba1a-4c57a5e7796e\") " pod="openshift-console-operator/console-operator-58897d9998-9qw5t" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.942764 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9e0f844-c3b0-4411-ba1a-4c57a5e7796e-trusted-ca\") pod \"console-operator-58897d9998-9qw5t\" (UID: \"f9e0f844-c3b0-4411-ba1a-4c57a5e7796e\") " pod="openshift-console-operator/console-operator-58897d9998-9qw5t" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.942785 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8748922b-1caa-42f6-a573-9a43c160b26a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qd4mj\" (UID: \"8748922b-1caa-42f6-a573-9a43c160b26a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qd4mj" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.942806 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9d54014-8ba5-4a3a-beb0-3b260f0ef64c-config\") pod \"machine-approver-56656f9798-dhqcw\" (UID: 
\"e9d54014-8ba5-4a3a-beb0-3b260f0ef64c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhqcw" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.942859 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czggb\" (UniqueName: \"kubernetes.io/projected/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-kube-api-access-czggb\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.942883 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/98d3f0fa-d5c6-4288-af8a-bdc0b29dab63-service-ca\") pod \"console-f9d7485db-jbdq8\" (UID: \"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63\") " pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.942905 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9984b825-d6a3-4756-b2dc-2a240ca82a8d-etcd-service-ca\") pod \"etcd-operator-b45778765-n7xr4\" (UID: \"9984b825-d6a3-4756-b2dc-2a240ca82a8d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.942930 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/98d3f0fa-d5c6-4288-af8a-bdc0b29dab63-console-config\") pod \"console-f9d7485db-jbdq8\" (UID: \"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63\") " pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.942961 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26e4ae04-1b0c-4cdf-a086-356bab16766e-samples-operator-tls\") pod 
\"cluster-samples-operator-665b6dd947-4xzsw\" (UID: \"26e4ae04-1b0c-4cdf-a086-356bab16766e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xzsw" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.942992 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9984b825-d6a3-4756-b2dc-2a240ca82a8d-config\") pod \"etcd-operator-b45778765-n7xr4\" (UID: \"9984b825-d6a3-4756-b2dc-2a240ca82a8d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943014 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-audit-dir\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943038 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/98d3f0fa-d5c6-4288-af8a-bdc0b29dab63-console-serving-cert\") pod \"console-f9d7485db-jbdq8\" (UID: \"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63\") " pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943063 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsdvc\" (UniqueName: \"kubernetes.io/projected/f9e0f844-c3b0-4411-ba1a-4c57a5e7796e-kube-api-access-gsdvc\") pod \"console-operator-58897d9998-9qw5t\" (UID: \"f9e0f844-c3b0-4411-ba1a-4c57a5e7796e\") " pod="openshift-console-operator/console-operator-58897d9998-9qw5t" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943086 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zhq4\" (UniqueName: 
\"kubernetes.io/projected/e9d54014-8ba5-4a3a-beb0-3b260f0ef64c-kube-api-access-9zhq4\") pod \"machine-approver-56656f9798-dhqcw\" (UID: \"e9d54014-8ba5-4a3a-beb0-3b260f0ef64c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhqcw" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943107 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04b27022-1294-45a5-90c4-17d007e9b468-client-ca\") pod \"route-controller-manager-6576b87f9c-tvbdd\" (UID: \"04b27022-1294-45a5-90c4-17d007e9b468\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943131 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943180 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943207 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943231 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943269 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9984b825-d6a3-4756-b2dc-2a240ca82a8d-serving-cert\") pod \"etcd-operator-b45778765-n7xr4\" (UID: \"9984b825-d6a3-4756-b2dc-2a240ca82a8d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943294 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62prj\" (UniqueName: \"kubernetes.io/projected/584ca99c-8678-42f1-8a73-704780debc36-kube-api-access-62prj\") pod \"machine-config-controller-84d6567774-7q8lx\" (UID: \"584ca99c-8678-42f1-8a73-704780debc36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7q8lx" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943325 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8748922b-1caa-42f6-a573-9a43c160b26a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qd4mj\" (UID: \"8748922b-1caa-42f6-a573-9a43c160b26a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qd4mj" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943347 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqd4t\" (UniqueName: 
\"kubernetes.io/projected/9984b825-d6a3-4756-b2dc-2a240ca82a8d-kube-api-access-xqd4t\") pod \"etcd-operator-b45778765-n7xr4\" (UID: \"9984b825-d6a3-4756-b2dc-2a240ca82a8d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943371 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943405 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943441 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943463 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/584ca99c-8678-42f1-8a73-704780debc36-proxy-tls\") pod \"machine-config-controller-84d6567774-7q8lx\" (UID: \"584ca99c-8678-42f1-8a73-704780debc36\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7q8lx" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943487 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9e0f844-c3b0-4411-ba1a-4c57a5e7796e-serving-cert\") pod \"console-operator-58897d9998-9qw5t\" (UID: \"f9e0f844-c3b0-4411-ba1a-4c57a5e7796e\") " pod="openshift-console-operator/console-operator-58897d9998-9qw5t" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943510 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9984b825-d6a3-4756-b2dc-2a240ca82a8d-etcd-ca\") pod \"etcd-operator-b45778765-n7xr4\" (UID: \"9984b825-d6a3-4756-b2dc-2a240ca82a8d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943540 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04b27022-1294-45a5-90c4-17d007e9b468-serving-cert\") pod \"route-controller-manager-6576b87f9c-tvbdd\" (UID: \"04b27022-1294-45a5-90c4-17d007e9b468\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943561 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wft66\" (UniqueName: \"kubernetes.io/projected/04b27022-1294-45a5-90c4-17d007e9b468-kube-api-access-wft66\") pod \"route-controller-manager-6576b87f9c-tvbdd\" (UID: \"04b27022-1294-45a5-90c4-17d007e9b468\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943578 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943599 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-audit-policies\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943615 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e9d54014-8ba5-4a3a-beb0-3b260f0ef64c-auth-proxy-config\") pod \"machine-approver-56656f9798-dhqcw\" (UID: \"e9d54014-8ba5-4a3a-beb0-3b260f0ef64c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhqcw" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943632 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e9d54014-8ba5-4a3a-beb0-3b260f0ef64c-machine-approver-tls\") pod \"machine-approver-56656f9798-dhqcw\" (UID: \"e9d54014-8ba5-4a3a-beb0-3b260f0ef64c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhqcw" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943650 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 
13:23:39.943668 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq6fd\" (UniqueName: \"kubernetes.io/projected/8748922b-1caa-42f6-a573-9a43c160b26a-kube-api-access-vq6fd\") pod \"openshift-controller-manager-operator-756b6f6bc6-qd4mj\" (UID: \"8748922b-1caa-42f6-a573-9a43c160b26a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qd4mj" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943683 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943954 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/98d3f0fa-d5c6-4288-af8a-bdc0b29dab63-oauth-serving-cert\") pod \"console-f9d7485db-jbdq8\" (UID: \"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63\") " pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.944260 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98d3f0fa-d5c6-4288-af8a-bdc0b29dab63-trusted-ca-bundle\") pod \"console-f9d7485db-jbdq8\" (UID: \"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63\") " pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.944695 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8748922b-1caa-42f6-a573-9a43c160b26a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qd4mj\" (UID: \"8748922b-1caa-42f6-a573-9a43c160b26a\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qd4mj" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.945254 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.943488 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/584ca99c-8678-42f1-8a73-704780debc36-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7q8lx\" (UID: \"584ca99c-8678-42f1-8a73-704780debc36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7q8lx" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.945362 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.945983 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-audit-dir\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.946086 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/9984b825-d6a3-4756-b2dc-2a240ca82a8d-etcd-ca\") pod \"etcd-operator-b45778765-n7xr4\" (UID: \"9984b825-d6a3-4756-b2dc-2a240ca82a8d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.946161 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-audit-policies\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.946564 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e9d54014-8ba5-4a3a-beb0-3b260f0ef64c-auth-proxy-config\") pod \"machine-approver-56656f9798-dhqcw\" (UID: \"e9d54014-8ba5-4a3a-beb0-3b260f0ef64c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhqcw" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.946620 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9984b825-d6a3-4756-b2dc-2a240ca82a8d-etcd-service-ca\") pod \"etcd-operator-b45778765-n7xr4\" (UID: \"9984b825-d6a3-4756-b2dc-2a240ca82a8d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.946822 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04b27022-1294-45a5-90c4-17d007e9b468-config\") pod \"route-controller-manager-6576b87f9c-tvbdd\" (UID: \"04b27022-1294-45a5-90c4-17d007e9b468\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.946824 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/98d3f0fa-d5c6-4288-af8a-bdc0b29dab63-console-config\") pod \"console-f9d7485db-jbdq8\" (UID: \"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63\") " pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.947125 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9e0f844-c3b0-4411-ba1a-4c57a5e7796e-config\") pod \"console-operator-58897d9998-9qw5t\" (UID: \"f9e0f844-c3b0-4411-ba1a-4c57a5e7796e\") " pod="openshift-console-operator/console-operator-58897d9998-9qw5t" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.947131 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9d54014-8ba5-4a3a-beb0-3b260f0ef64c-config\") pod \"machine-approver-56656f9798-dhqcw\" (UID: \"e9d54014-8ba5-4a3a-beb0-3b260f0ef64c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhqcw" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.947396 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.947727 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04b27022-1294-45a5-90c4-17d007e9b468-client-ca\") pod \"route-controller-manager-6576b87f9c-tvbdd\" (UID: \"04b27022-1294-45a5-90c4-17d007e9b468\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.948091 4703 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/98d3f0fa-d5c6-4288-af8a-bdc0b29dab63-service-ca\") pod \"console-f9d7485db-jbdq8\" (UID: \"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63\") " pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.948191 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9e0f844-c3b0-4411-ba1a-4c57a5e7796e-trusted-ca\") pod \"console-operator-58897d9998-9qw5t\" (UID: \"f9e0f844-c3b0-4411-ba1a-4c57a5e7796e\") " pod="openshift-console-operator/console-operator-58897d9998-9qw5t" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.948432 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9984b825-d6a3-4756-b2dc-2a240ca82a8d-config\") pod \"etcd-operator-b45778765-n7xr4\" (UID: \"9984b825-d6a3-4756-b2dc-2a240ca82a8d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.949125 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.951115 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 
13:23:39.951471 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.951593 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04b27022-1294-45a5-90c4-17d007e9b468-serving-cert\") pod \"route-controller-manager-6576b87f9c-tvbdd\" (UID: \"04b27022-1294-45a5-90c4-17d007e9b468\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.952006 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.952079 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.952118 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: 
\"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.952165 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/98d3f0fa-d5c6-4288-af8a-bdc0b29dab63-console-oauth-config\") pod \"console-f9d7485db-jbdq8\" (UID: \"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63\") " pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.952278 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9e0f844-c3b0-4411-ba1a-4c57a5e7796e-serving-cert\") pod \"console-operator-58897d9998-9qw5t\" (UID: \"f9e0f844-c3b0-4411-ba1a-4c57a5e7796e\") " pod="openshift-console-operator/console-operator-58897d9998-9qw5t" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.952364 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e9d54014-8ba5-4a3a-beb0-3b260f0ef64c-machine-approver-tls\") pod \"machine-approver-56656f9798-dhqcw\" (UID: \"e9d54014-8ba5-4a3a-beb0-3b260f0ef64c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhqcw" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.952368 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9984b825-d6a3-4756-b2dc-2a240ca82a8d-serving-cert\") pod \"etcd-operator-b45778765-n7xr4\" (UID: \"9984b825-d6a3-4756-b2dc-2a240ca82a8d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.952468 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8748922b-1caa-42f6-a573-9a43c160b26a-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-qd4mj\" (UID: \"8748922b-1caa-42f6-a573-9a43c160b26a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qd4mj" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.952584 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9984b825-d6a3-4756-b2dc-2a240ca82a8d-etcd-client\") pod \"etcd-operator-b45778765-n7xr4\" (UID: \"9984b825-d6a3-4756-b2dc-2a240ca82a8d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.952704 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.953228 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/98d3f0fa-d5c6-4288-af8a-bdc0b29dab63-console-serving-cert\") pod \"console-f9d7485db-jbdq8\" (UID: \"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63\") " pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.953569 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26e4ae04-1b0c-4cdf-a086-356bab16766e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4xzsw\" (UID: \"26e4ae04-1b0c-4cdf-a086-356bab16766e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xzsw" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.954690 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.961909 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 09 13:23:39 crc kubenswrapper[4703]: I0309 13:23:39.981804 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.001986 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.022013 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.041936 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.061803 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.081561 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.102743 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.121271 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 
13:23:40.142474 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.162211 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.182640 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.202385 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.223202 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.241578 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.278124 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.282535 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.321968 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.342945 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.363323 4703 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.383035 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.401581 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.421595 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.432097 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/584ca99c-8678-42f1-8a73-704780debc36-proxy-tls\") pod \"machine-config-controller-84d6567774-7q8lx\" (UID: \"584ca99c-8678-42f1-8a73-704780debc36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7q8lx" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.443186 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.502276 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.521823 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.563462 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.563515 4703 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.583196 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.602553 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.623396 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.642590 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.662894 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.681906 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.701969 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.723106 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 09 13:23:40 crc kubenswrapper[4703]: E0309 13:23:40.739655 4703 secret.go:188] Couldn't get secret openshift-apiserver-operator/openshift-apiserver-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 09 13:23:40 crc kubenswrapper[4703]: E0309 13:23:40.739721 4703 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/955963d3-5f3c-46c6-bfa0-6473a1238064-serving-cert podName:955963d3-5f3c-46c6-bfa0-6473a1238064 nodeName:}" failed. No retries permitted until 2026-03-09 13:23:41.239702871 +0000 UTC m=+217.207118557 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/955963d3-5f3c-46c6-bfa0-6473a1238064-serving-cert") pod "openshift-apiserver-operator-796bbdcf4f-vh56s" (UID: "955963d3-5f3c-46c6-bfa0-6473a1238064") : failed to sync secret cache: timed out waiting for the condition Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.743063 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.763000 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.780167 4703 request.go:700] Waited for 1.000308842s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/configmaps?fieldSelector=metadata.name%3Dkube-controller-manager-operator-config&limit=500&resourceVersion=0 Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.781960 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.803434 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.822094 4703 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 09 13:23:40 crc kubenswrapper[4703]: E0309 13:23:40.840460 4703 secret.go:188] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 09 13:23:40 crc kubenswrapper[4703]: E0309 13:23:40.840475 4703 configmap.go:193] Couldn't get configMap openshift-authentication-operator/authentication-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 09 13:23:40 crc kubenswrapper[4703]: E0309 13:23:40.840506 4703 secret.go:188] Couldn't get secret openshift-authentication-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 09 13:23:40 crc kubenswrapper[4703]: E0309 13:23:40.840599 4703 configmap.go:193] Couldn't get configMap openshift-authentication-operator/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 09 13:23:40 crc kubenswrapper[4703]: E0309 13:23:40.840604 4703 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 09 13:23:40 crc kubenswrapper[4703]: E0309 13:23:40.841050 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e8298d3-3e49-4df3-9369-2623b11981cf-machine-api-operator-tls podName:7e8298d3-3e49-4df3-9369-2623b11981cf nodeName:}" failed. No retries permitted until 2026-03-09 13:23:41.340789601 +0000 UTC m=+217.308205287 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/7e8298d3-3e49-4df3-9369-2623b11981cf-machine-api-operator-tls") pod "machine-api-operator-5694c8668f-6qv87" (UID: "7e8298d3-3e49-4df3-9369-2623b11981cf") : failed to sync secret cache: timed out waiting for the condition Mar 09 13:23:40 crc kubenswrapper[4703]: E0309 13:23:40.841091 4703 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition Mar 09 13:23:40 crc kubenswrapper[4703]: E0309 13:23:40.841126 4703 configmap.go:193] Couldn't get configMap openshift-authentication-operator/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 09 13:23:40 crc kubenswrapper[4703]: E0309 13:23:40.841159 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5acad2e3-ce55-43d7-b83f-75adb6b59e71-config podName:5acad2e3-ce55-43d7-b83f-75adb6b59e71 nodeName:}" failed. No retries permitted until 2026-03-09 13:23:41.341144832 +0000 UTC m=+217.308560518 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/5acad2e3-ce55-43d7-b83f-75adb6b59e71-config") pod "authentication-operator-69f744f599-tnn9w" (UID: "5acad2e3-ce55-43d7-b83f-75adb6b59e71") : failed to sync configmap cache: timed out waiting for the condition Mar 09 13:23:40 crc kubenswrapper[4703]: E0309 13:23:40.841386 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5acad2e3-ce55-43d7-b83f-75adb6b59e71-serving-cert podName:5acad2e3-ce55-43d7-b83f-75adb6b59e71 nodeName:}" failed. No retries permitted until 2026-03-09 13:23:41.341362518 +0000 UTC m=+217.308778274 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/5acad2e3-ce55-43d7-b83f-75adb6b59e71-serving-cert") pod "authentication-operator-69f744f599-tnn9w" (UID: "5acad2e3-ce55-43d7-b83f-75adb6b59e71") : failed to sync secret cache: timed out waiting for the condition Mar 09 13:23:40 crc kubenswrapper[4703]: E0309 13:23:40.841411 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5acad2e3-ce55-43d7-b83f-75adb6b59e71-service-ca-bundle podName:5acad2e3-ce55-43d7-b83f-75adb6b59e71 nodeName:}" failed. No retries permitted until 2026-03-09 13:23:41.3413983 +0000 UTC m=+217.308814006 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5acad2e3-ce55-43d7-b83f-75adb6b59e71-service-ca-bundle") pod "authentication-operator-69f744f599-tnn9w" (UID: "5acad2e3-ce55-43d7-b83f-75adb6b59e71") : failed to sync configmap cache: timed out waiting for the condition Mar 09 13:23:40 crc kubenswrapper[4703]: E0309 13:23:40.841431 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7e8298d3-3e49-4df3-9369-2623b11981cf-config podName:7e8298d3-3e49-4df3-9369-2623b11981cf nodeName:}" failed. No retries permitted until 2026-03-09 13:23:41.34142143 +0000 UTC m=+217.308837206 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/7e8298d3-3e49-4df3-9369-2623b11981cf-config") pod "machine-api-operator-5694c8668f-6qv87" (UID: "7e8298d3-3e49-4df3-9369-2623b11981cf") : failed to sync configmap cache: timed out waiting for the condition Mar 09 13:23:40 crc kubenswrapper[4703]: E0309 13:23:40.841468 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7e8298d3-3e49-4df3-9369-2623b11981cf-images podName:7e8298d3-3e49-4df3-9369-2623b11981cf nodeName:}" failed. 
No retries permitted until 2026-03-09 13:23:41.341456431 +0000 UTC m=+217.308872127 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/7e8298d3-3e49-4df3-9369-2623b11981cf-images") pod "machine-api-operator-5694c8668f-6qv87" (UID: "7e8298d3-3e49-4df3-9369-2623b11981cf") : failed to sync configmap cache: timed out waiting for the condition Mar 09 13:23:40 crc kubenswrapper[4703]: E0309 13:23:40.841488 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5acad2e3-ce55-43d7-b83f-75adb6b59e71-trusted-ca-bundle podName:5acad2e3-ce55-43d7-b83f-75adb6b59e71 nodeName:}" failed. No retries permitted until 2026-03-09 13:23:41.341477292 +0000 UTC m=+217.308892988 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/5acad2e3-ce55-43d7-b83f-75adb6b59e71-trusted-ca-bundle") pod "authentication-operator-69f744f599-tnn9w" (UID: "5acad2e3-ce55-43d7-b83f-75adb6b59e71") : failed to sync configmap cache: timed out waiting for the condition Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.842672 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.861933 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.883058 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.902061 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 09 13:23:40 crc 
kubenswrapper[4703]: I0309 13:23:40.922469 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.942017 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.963527 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 09 13:23:40 crc kubenswrapper[4703]: I0309 13:23:40.983528 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.002757 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.023657 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.043109 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.062266 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.082931 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.103281 4703 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.122077 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.143110 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.163392 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.183017 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.202941 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.222957 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.243429 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.264011 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.271430 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/955963d3-5f3c-46c6-bfa0-6473a1238064-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vh56s\" (UID: \"955963d3-5f3c-46c6-bfa0-6473a1238064\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vh56s" Mar 09 13:23:41 crc 
kubenswrapper[4703]: I0309 13:23:41.282042 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.302494 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.323151 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.343319 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.362778 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.372873 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7e8298d3-3e49-4df3-9369-2623b11981cf-images\") pod \"machine-api-operator-5694c8668f-6qv87\" (UID: \"7e8298d3-3e49-4df3-9369-2623b11981cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6qv87" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.372960 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5acad2e3-ce55-43d7-b83f-75adb6b59e71-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tnn9w\" (UID: \"5acad2e3-ce55-43d7-b83f-75adb6b59e71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.373090 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7e8298d3-3e49-4df3-9369-2623b11981cf-config\") pod \"machine-api-operator-5694c8668f-6qv87\" (UID: \"7e8298d3-3e49-4df3-9369-2623b11981cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6qv87" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.373134 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e8298d3-3e49-4df3-9369-2623b11981cf-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6qv87\" (UID: \"7e8298d3-3e49-4df3-9369-2623b11981cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6qv87" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.373246 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5acad2e3-ce55-43d7-b83f-75adb6b59e71-serving-cert\") pod \"authentication-operator-69f744f599-tnn9w\" (UID: \"5acad2e3-ce55-43d7-b83f-75adb6b59e71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.373302 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5acad2e3-ce55-43d7-b83f-75adb6b59e71-config\") pod \"authentication-operator-69f744f599-tnn9w\" (UID: \"5acad2e3-ce55-43d7-b83f-75adb6b59e71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.373357 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5acad2e3-ce55-43d7-b83f-75adb6b59e71-service-ca-bundle\") pod \"authentication-operator-69f744f599-tnn9w\" (UID: \"5acad2e3-ce55-43d7-b83f-75adb6b59e71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" Mar 09 13:23:41 crc 
kubenswrapper[4703]: I0309 13:23:41.391515 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.402202 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.422736 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.443062 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.463589 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.483215 4703 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 09 13:23:41 crc kubenswrapper[4703]: E0309 13:23:41.493469 4703 projected.go:288] Couldn't get configMap openshift-apiserver-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 09 13:23:41 crc kubenswrapper[4703]: E0309 13:23:41.493536 4703 projected.go:194] Error preparing data for projected volume kube-api-access-9f94d for pod openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vh56s: failed to sync configmap cache: timed out waiting for the condition Mar 09 13:23:41 crc kubenswrapper[4703]: E0309 13:23:41.493625 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/955963d3-5f3c-46c6-bfa0-6473a1238064-kube-api-access-9f94d podName:955963d3-5f3c-46c6-bfa0-6473a1238064 nodeName:}" failed. 
No retries permitted until 2026-03-09 13:23:41.993598379 +0000 UTC m=+217.961014075 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-9f94d" (UniqueName: "kubernetes.io/projected/955963d3-5f3c-46c6-bfa0-6473a1238064-kube-api-access-9f94d") pod "openshift-apiserver-operator-796bbdcf4f-vh56s" (UID: "955963d3-5f3c-46c6-bfa0-6473a1238064") : failed to sync configmap cache: timed out waiting for the condition Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.501915 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.522155 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.581435 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.601950 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.621715 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.642754 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.662539 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.682963 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.702508 4703 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-default-metrics-tls" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.722590 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.743330 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.761940 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.780646 4703 request.go:700] Waited for 1.837679607s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/console/token Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.795839 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml2tc\" (UniqueName: \"kubernetes.io/projected/98d3f0fa-d5c6-4288-af8a-bdc0b29dab63-kube-api-access-ml2tc\") pod \"console-f9d7485db-jbdq8\" (UID: \"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63\") " pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.817757 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4228r\" (UniqueName: \"kubernetes.io/projected/26e4ae04-1b0c-4cdf-a086-356bab16766e-kube-api-access-4228r\") pod \"cluster-samples-operator-665b6dd947-4xzsw\" (UID: \"26e4ae04-1b0c-4cdf-a086-356bab16766e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xzsw" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.840039 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wft66\" (UniqueName: 
\"kubernetes.io/projected/04b27022-1294-45a5-90c4-17d007e9b468-kube-api-access-wft66\") pod \"route-controller-manager-6576b87f9c-tvbdd\" (UID: \"04b27022-1294-45a5-90c4-17d007e9b468\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.876589 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq6fd\" (UniqueName: \"kubernetes.io/projected/8748922b-1caa-42f6-a573-9a43c160b26a-kube-api-access-vq6fd\") pod \"openshift-controller-manager-operator-756b6f6bc6-qd4mj\" (UID: \"8748922b-1caa-42f6-a573-9a43c160b26a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qd4mj" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.881713 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqd4t\" (UniqueName: \"kubernetes.io/projected/9984b825-d6a3-4756-b2dc-2a240ca82a8d-kube-api-access-xqd4t\") pod \"etcd-operator-b45778765-n7xr4\" (UID: \"9984b825-d6a3-4756-b2dc-2a240ca82a8d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.882029 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.896708 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62prj\" (UniqueName: \"kubernetes.io/projected/584ca99c-8678-42f1-8a73-704780debc36-kube-api-access-62prj\") pod \"machine-config-controller-84d6567774-7q8lx\" (UID: \"584ca99c-8678-42f1-8a73-704780debc36\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7q8lx" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.919028 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsdvc\" (UniqueName: \"kubernetes.io/projected/f9e0f844-c3b0-4411-ba1a-4c57a5e7796e-kube-api-access-gsdvc\") pod \"console-operator-58897d9998-9qw5t\" (UID: \"f9e0f844-c3b0-4411-ba1a-4c57a5e7796e\") " pod="openshift-console-operator/console-operator-58897d9998-9qw5t" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.934503 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qd4mj" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.936217 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czggb\" (UniqueName: \"kubernetes.io/projected/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-kube-api-access-czggb\") pod \"oauth-openshift-558db77b4-ghbk9\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.953810 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.963639 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xzsw" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.965000 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zhq4\" (UniqueName: \"kubernetes.io/projected/e9d54014-8ba5-4a3a-beb0-3b260f0ef64c-kube-api-access-9zhq4\") pod \"machine-approver-56656f9798-dhqcw\" (UID: \"e9d54014-8ba5-4a3a-beb0-3b260f0ef64c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhqcw" Mar 09 13:23:41 crc kubenswrapper[4703]: I0309 13:23:41.979164 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.008212 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.014677 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e8298d3-3e49-4df3-9369-2623b11981cf-config\") pod \"machine-api-operator-5694c8668f-6qv87\" (UID: \"7e8298d3-3e49-4df3-9369-2623b11981cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6qv87" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.023242 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.043381 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7q8lx" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.045637 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.062191 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e8298d3-3e49-4df3-9369-2623b11981cf-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6qv87\" (UID: \"7e8298d3-3e49-4df3-9369-2623b11981cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6qv87" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.069235 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.075563 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5acad2e3-ce55-43d7-b83f-75adb6b59e71-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tnn9w\" (UID: \"5acad2e3-ce55-43d7-b83f-75adb6b59e71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.081808 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-registry-tls\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.081901 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/7b6068d6-17d2-4802-8778-e1ab076da652-stats-auth\") pod \"router-default-5444994796-nzj56\" (UID: \"7b6068d6-17d2-4802-8778-e1ab076da652\") " pod="openshift-ingress/router-default-5444994796-nzj56" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.081931 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bc6d0856-76af-4510-a9b8-1cefb8e82e79-profile-collector-cert\") pod \"catalog-operator-68c6474976-88z8c\" (UID: \"bc6d0856-76af-4510-a9b8-1cefb8e82e79\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-88z8c" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.081966 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-trusted-ca\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.081998 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvpdj\" (UniqueName: \"kubernetes.io/projected/c90bc0f7-a19a-46ff-8610-5ee7274faa34-kube-api-access-pvpdj\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082019 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed696cd9-4261-4e65-a89d-17f918249fc9-client-ca\") pod \"controller-manager-879f6c89f-fjq8h\" (UID: \"ed696cd9-4261-4e65-a89d-17f918249fc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" Mar 09 13:23:42 crc 
kubenswrapper[4703]: I0309 13:23:42.082038 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wm2c\" (UniqueName: \"kubernetes.io/projected/f9fc9149-f3ff-430c-9ea7-b6050cc19222-kube-api-access-4wm2c\") pod \"dns-operator-744455d44c-5clrh\" (UID: \"f9fc9149-f3ff-430c-9ea7-b6050cc19222\") " pod="openshift-dns-operator/dns-operator-744455d44c-5clrh" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082188 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/507f368b-bd66-4e0a-8e30-bc8505f3f76f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082442 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c90bc0f7-a19a-46ff-8610-5ee7274faa34-serving-cert\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082469 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9460c02-0c7c-4f14-9950-407330c1f960-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kn4xb\" (UID: \"e9460c02-0c7c-4f14-9950-407330c1f960\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kn4xb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082493 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed696cd9-4261-4e65-a89d-17f918249fc9-proxy-ca-bundles\") pod 
\"controller-manager-879f6c89f-fjq8h\" (UID: \"ed696cd9-4261-4e65-a89d-17f918249fc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082515 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b6068d6-17d2-4802-8778-e1ab076da652-service-ca-bundle\") pod \"router-default-5444994796-nzj56\" (UID: \"7b6068d6-17d2-4802-8778-e1ab076da652\") " pod="openshift-ingress/router-default-5444994796-nzj56" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082620 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/507f368b-bd66-4e0a-8e30-bc8505f3f76f-serving-cert\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082690 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f94d\" (UniqueName: \"kubernetes.io/projected/955963d3-5f3c-46c6-bfa0-6473a1238064-kube-api-access-9f94d\") pod \"openshift-apiserver-operator-796bbdcf4f-vh56s\" (UID: \"955963d3-5f3c-46c6-bfa0-6473a1238064\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vh56s" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082724 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/271141cc-078d-419f-b9c5-d7ef6263442d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gnllw\" (UID: \"271141cc-078d-419f-b9c5-d7ef6263442d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gnllw" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082748 4703 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/507f368b-bd66-4e0a-8e30-bc8505f3f76f-etcd-client\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082769 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b216ba2d-b6d1-4363-b54b-57f43f575c33-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vsdf8\" (UID: \"b216ba2d-b6d1-4363-b54b-57f43f575c33\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdf8" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082790 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mldw9\" (UniqueName: \"kubernetes.io/projected/271141cc-078d-419f-b9c5-d7ef6263442d-kube-api-access-mldw9\") pod \"openshift-config-operator-7777fb866f-gnllw\" (UID: \"271141cc-078d-419f-b9c5-d7ef6263442d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gnllw" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082805 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gpmc\" (UniqueName: \"kubernetes.io/projected/b216ba2d-b6d1-4363-b54b-57f43f575c33-kube-api-access-9gpmc\") pod \"ingress-operator-5b745b69d9-vsdf8\" (UID: \"b216ba2d-b6d1-4363-b54b-57f43f575c33\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdf8" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082829 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9fc9149-f3ff-430c-9ea7-b6050cc19222-metrics-tls\") pod \"dns-operator-744455d44c-5clrh\" (UID: 
\"f9fc9149-f3ff-430c-9ea7-b6050cc19222\") " pod="openshift-dns-operator/dns-operator-744455d44c-5clrh" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082856 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c90bc0f7-a19a-46ff-8610-5ee7274faa34-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082872 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed696cd9-4261-4e65-a89d-17f918249fc9-config\") pod \"controller-manager-879f6c89f-fjq8h\" (UID: \"ed696cd9-4261-4e65-a89d-17f918249fc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082886 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7b6068d6-17d2-4802-8778-e1ab076da652-default-certificate\") pod \"router-default-5444994796-nzj56\" (UID: \"7b6068d6-17d2-4802-8778-e1ab076da652\") " pod="openshift-ingress/router-default-5444994796-nzj56" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082914 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082932 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/507f368b-bd66-4e0a-8e30-bc8505f3f76f-audit-dir\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082949 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9460c02-0c7c-4f14-9950-407330c1f960-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kn4xb\" (UID: \"e9460c02-0c7c-4f14-9950-407330c1f960\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kn4xb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082968 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b216ba2d-b6d1-4363-b54b-57f43f575c33-metrics-tls\") pod \"ingress-operator-5b745b69d9-vsdf8\" (UID: \"b216ba2d-b6d1-4363-b54b-57f43f575c33\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdf8" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082997 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/271141cc-078d-419f-b9c5-d7ef6263442d-serving-cert\") pod \"openshift-config-operator-7777fb866f-gnllw\" (UID: \"271141cc-078d-419f-b9c5-d7ef6263442d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gnllw" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.083025 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.083041 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcjjp\" (UniqueName: \"kubernetes.io/projected/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-kube-api-access-kcjjp\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.083055 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/507f368b-bd66-4e0a-8e30-bc8505f3f76f-config\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.083071 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9460c02-0c7c-4f14-9950-407330c1f960-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kn4xb\" (UID: \"e9460c02-0c7c-4f14-9950-407330c1f960\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kn4xb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.083090 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/507f368b-bd66-4e0a-8e30-bc8505f3f76f-encryption-config\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.083109 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/507f368b-bd66-4e0a-8e30-bc8505f3f76f-etcd-serving-ca\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.083133 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c90bc0f7-a19a-46ff-8610-5ee7274faa34-encryption-config\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.083155 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/507f368b-bd66-4e0a-8e30-bc8505f3f76f-audit\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.083172 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.083185 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/507f368b-bd66-4e0a-8e30-bc8505f3f76f-node-pullsecrets\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.083203 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c90bc0f7-a19a-46ff-8610-5ee7274faa34-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.083221 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cldjt\" (UniqueName: \"kubernetes.io/projected/2b4dabbf-243f-4504-abc1-b34b4da6a25c-kube-api-access-cldjt\") pod \"downloads-7954f5f757-vs8rs\" (UID: \"2b4dabbf-243f-4504-abc1-b34b4da6a25c\") " pod="openshift-console/downloads-7954f5f757-vs8rs" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.083247 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/507f368b-bd66-4e0a-8e30-bc8505f3f76f-image-import-ca\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.083262 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c90bc0f7-a19a-46ff-8610-5ee7274faa34-audit-policies\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.083278 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9m6f\" (UniqueName: \"kubernetes.io/projected/ed696cd9-4261-4e65-a89d-17f918249fc9-kube-api-access-m9m6f\") pod \"controller-manager-879f6c89f-fjq8h\" (UID: \"ed696cd9-4261-4e65-a89d-17f918249fc9\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.083292 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-registry-certificates\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.083306 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c90bc0f7-a19a-46ff-8610-5ee7274faa34-audit-dir\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.083323 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-bound-sa-token\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.083338 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b216ba2d-b6d1-4363-b54b-57f43f575c33-trusted-ca\") pod \"ingress-operator-5b745b69d9-vsdf8\" (UID: \"b216ba2d-b6d1-4363-b54b-57f43f575c33\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdf8" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.083385 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjpr6\" (UniqueName: 
\"kubernetes.io/projected/bc6d0856-76af-4510-a9b8-1cefb8e82e79-kube-api-access-tjpr6\") pod \"catalog-operator-68c6474976-88z8c\" (UID: \"bc6d0856-76af-4510-a9b8-1cefb8e82e79\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-88z8c" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.082745 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 09 13:23:42 crc kubenswrapper[4703]: E0309 13:23:42.083679 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:42.583668233 +0000 UTC m=+218.551083919 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.083405 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn84v\" (UniqueName: \"kubernetes.io/projected/7b6068d6-17d2-4802-8778-e1ab076da652-kube-api-access-vn84v\") pod \"router-default-5444994796-nzj56\" (UID: \"7b6068d6-17d2-4802-8778-e1ab076da652\") " pod="openshift-ingress/router-default-5444994796-nzj56" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.084029 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed696cd9-4261-4e65-a89d-17f918249fc9-serving-cert\") pod 
\"controller-manager-879f6c89f-fjq8h\" (UID: \"ed696cd9-4261-4e65-a89d-17f918249fc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.084070 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm8xb\" (UniqueName: \"kubernetes.io/projected/507f368b-bd66-4e0a-8e30-bc8505f3f76f-kube-api-access-jm8xb\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.084112 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c90bc0f7-a19a-46ff-8610-5ee7274faa34-etcd-client\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.084134 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b6068d6-17d2-4802-8778-e1ab076da652-metrics-certs\") pod \"router-default-5444994796-nzj56\" (UID: \"7b6068d6-17d2-4802-8778-e1ab076da652\") " pod="openshift-ingress/router-default-5444994796-nzj56" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.084153 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bc6d0856-76af-4510-a9b8-1cefb8e82e79-srv-cert\") pod \"catalog-operator-68c6474976-88z8c\" (UID: \"bc6d0856-76af-4510-a9b8-1cefb8e82e79\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-88z8c" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.084178 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-2z9k7\" (UniqueName: \"kubernetes.io/projected/e9460c02-0c7c-4f14-9950-407330c1f960-kube-api-access-2z9k7\") pod \"cluster-image-registry-operator-dc59b4c8b-kn4xb\" (UID: \"e9460c02-0c7c-4f14-9950-407330c1f960\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kn4xb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.089503 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f94d\" (UniqueName: \"kubernetes.io/projected/955963d3-5f3c-46c6-bfa0-6473a1238064-kube-api-access-9f94d\") pod \"openshift-apiserver-operator-796bbdcf4f-vh56s\" (UID: \"955963d3-5f3c-46c6-bfa0-6473a1238064\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vh56s" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.089538 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5acad2e3-ce55-43d7-b83f-75adb6b59e71-serving-cert\") pod \"authentication-operator-69f744f599-tnn9w\" (UID: \"5acad2e3-ce55-43d7-b83f-75adb6b59e71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.090941 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhqcw" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.100151 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-9qw5t" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.103447 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.104235 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5acad2e3-ce55-43d7-b83f-75adb6b59e71-config\") pod \"authentication-operator-69f744f599-tnn9w\" (UID: \"5acad2e3-ce55-43d7-b83f-75adb6b59e71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.111907 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd"] Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.122259 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.142036 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhqcw" event={"ID":"e9d54014-8ba5-4a3a-beb0-3b260f0ef64c","Type":"ContainerStarted","Data":"196c036addc91d957298b3ed038e43f47c51ee217418dfbf724472ea81a6717d"} Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.142154 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.144815 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" event={"ID":"04b27022-1294-45a5-90c4-17d007e9b468","Type":"ContainerStarted","Data":"f7c0f2396263320a4d712797f891da07e944ef355ab8225f6bc56cb696d7eaf5"} Mar 09 13:23:42 crc 
kubenswrapper[4703]: I0309 13:23:42.152158 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fh9g\" (UniqueName: \"kubernetes.io/projected/7e8298d3-3e49-4df3-9369-2623b11981cf-kube-api-access-4fh9g\") pod \"machine-api-operator-5694c8668f-6qv87\" (UID: \"7e8298d3-3e49-4df3-9369-2623b11981cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6qv87" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.161860 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.181461 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.185115 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.185275 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-registry-tls\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.185300 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7dbf6b37-f8d9-4a20-84e0-e9b137a29db6-csi-data-dir\") pod \"csi-hostpathplugin-pjlkd\" (UID: \"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6\") " 
pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.185550 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7b6068d6-17d2-4802-8778-e1ab076da652-stats-auth\") pod \"router-default-5444994796-nzj56\" (UID: \"7b6068d6-17d2-4802-8778-e1ab076da652\") " pod="openshift-ingress/router-default-5444994796-nzj56" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.185591 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvpdj\" (UniqueName: \"kubernetes.io/projected/c90bc0f7-a19a-46ff-8610-5ee7274faa34-kube-api-access-pvpdj\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.185606 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed696cd9-4261-4e65-a89d-17f918249fc9-client-ca\") pod \"controller-manager-879f6c89f-fjq8h\" (UID: \"ed696cd9-4261-4e65-a89d-17f918249fc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.185635 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-trusted-ca\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.185650 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9460c02-0c7c-4f14-9950-407330c1f960-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kn4xb\" (UID: 
\"e9460c02-0c7c-4f14-9950-407330c1f960\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kn4xb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.185665 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wm2c\" (UniqueName: \"kubernetes.io/projected/f9fc9149-f3ff-430c-9ea7-b6050cc19222-kube-api-access-4wm2c\") pod \"dns-operator-744455d44c-5clrh\" (UID: \"f9fc9149-f3ff-430c-9ea7-b6050cc19222\") " pod="openshift-dns-operator/dns-operator-744455d44c-5clrh" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.185683 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c90bc0f7-a19a-46ff-8610-5ee7274faa34-serving-cert\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.185707 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/507f368b-bd66-4e0a-8e30-bc8505f3f76f-serving-cert\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.185722 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed696cd9-4261-4e65-a89d-17f918249fc9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fjq8h\" (UID: \"ed696cd9-4261-4e65-a89d-17f918249fc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186054 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7b6068d6-17d2-4802-8778-e1ab076da652-service-ca-bundle\") pod \"router-default-5444994796-nzj56\" (UID: \"7b6068d6-17d2-4802-8778-e1ab076da652\") " pod="openshift-ingress/router-default-5444994796-nzj56" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186103 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7dbf6b37-f8d9-4a20-84e0-e9b137a29db6-registration-dir\") pod \"csi-hostpathplugin-pjlkd\" (UID: \"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6\") " pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186131 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/866e32df-fa52-499f-9a9f-dfb02663f3b5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-hzcrb\" (UID: \"866e32df-fa52-499f-9a9f-dfb02663f3b5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hzcrb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186147 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60dddbd2-8b96-4ee8-8be3-d745ce95d197-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-g6rg5\" (UID: \"60dddbd2-8b96-4ee8-8be3-d745ce95d197\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g6rg5" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186177 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djqtx\" (UniqueName: \"kubernetes.io/projected/ab747cdc-f7d4-4f5a-98de-f82e2ff9fd29-kube-api-access-djqtx\") pod \"ingress-canary-vkt4s\" (UID: \"ab747cdc-f7d4-4f5a-98de-f82e2ff9fd29\") " 
pod="openshift-ingress-canary/ingress-canary-vkt4s" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186192 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c90bc0f7-a19a-46ff-8610-5ee7274faa34-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186228 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed696cd9-4261-4e65-a89d-17f918249fc9-config\") pod \"controller-manager-879f6c89f-fjq8h\" (UID: \"ed696cd9-4261-4e65-a89d-17f918249fc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186247 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7b6068d6-17d2-4802-8778-e1ab076da652-default-certificate\") pod \"router-default-5444994796-nzj56\" (UID: \"7b6068d6-17d2-4802-8778-e1ab076da652\") " pod="openshift-ingress/router-default-5444994796-nzj56" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186261 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186336 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9460c02-0c7c-4f14-9950-407330c1f960-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-dc59b4c8b-kn4xb\" (UID: \"e9460c02-0c7c-4f14-9950-407330c1f960\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kn4xb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186364 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b216ba2d-b6d1-4363-b54b-57f43f575c33-metrics-tls\") pod \"ingress-operator-5b745b69d9-vsdf8\" (UID: \"b216ba2d-b6d1-4363-b54b-57f43f575c33\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdf8" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186380 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0739e37-ae23-4c93-bf96-8c7f6aa72303-config\") pod \"kube-apiserver-operator-766d6c64bb-6fmxt\" (UID: \"a0739e37-ae23-4c93-bf96-8c7f6aa72303\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6fmxt" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186419 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlmdj\" (UniqueName: \"kubernetes.io/projected/47af2971-9c78-4d96-a31f-8b77ea0adca2-kube-api-access-zlmdj\") pod \"marketplace-operator-79b997595-ptzrr\" (UID: \"47af2971-9c78-4d96-a31f-8b77ea0adca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186459 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/737f3034-5b9b-49b1-9be8-4a74363050c7-config-volume\") pod \"dns-default-trj95\" (UID: \"737f3034-5b9b-49b1-9be8-4a74363050c7\") " pod="openshift-dns/dns-default-trj95" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186486 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0739e37-ae23-4c93-bf96-8c7f6aa72303-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6fmxt\" (UID: \"a0739e37-ae23-4c93-bf96-8c7f6aa72303\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6fmxt" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186501 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7dbf6b37-f8d9-4a20-84e0-e9b137a29db6-socket-dir\") pod \"csi-hostpathplugin-pjlkd\" (UID: \"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6\") " pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186577 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/507f368b-bd66-4e0a-8e30-bc8505f3f76f-etcd-serving-ca\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186617 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce9452a4-e880-4740-85e9-37ed967f7c75-serving-cert\") pod \"service-ca-operator-777779d784-wzz7z\" (UID: \"ce9452a4-e880-4740-85e9-37ed967f7c75\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wzz7z" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186654 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186684 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cldjt\" (UniqueName: \"kubernetes.io/projected/2b4dabbf-243f-4504-abc1-b34b4da6a25c-kube-api-access-cldjt\") pod \"downloads-7954f5f757-vs8rs\" (UID: \"2b4dabbf-243f-4504-abc1-b34b4da6a25c\") " pod="openshift-console/downloads-7954f5f757-vs8rs" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186702 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8e22655-eb49-4031-926f-866b29597424-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-knp8m\" (UID: \"d8e22655-eb49-4031-926f-866b29597424\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knp8m" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186719 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce9452a4-e880-4740-85e9-37ed967f7c75-config\") pod \"service-ca-operator-777779d784-wzz7z\" (UID: \"ce9452a4-e880-4740-85e9-37ed967f7c75\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wzz7z" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186736 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqqv7\" (UniqueName: \"kubernetes.io/projected/7dbf6b37-f8d9-4a20-84e0-e9b137a29db6-kube-api-access-jqqv7\") pod \"csi-hostpathplugin-pjlkd\" (UID: \"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6\") " pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186755 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/507f368b-bd66-4e0a-8e30-bc8505f3f76f-image-import-ca\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186772 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2mmt\" (UniqueName: \"kubernetes.io/projected/64728a68-4675-4652-a800-7f055197862b-kube-api-access-h2mmt\") pod \"auto-csr-approver-29551042-fv4ms\" (UID: \"64728a68-4675-4652-a800-7f055197862b\") " pod="openshift-infra/auto-csr-approver-29551042-fv4ms" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186789 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27k2p\" (UniqueName: \"kubernetes.io/projected/ce9452a4-e880-4740-85e9-37ed967f7c75-kube-api-access-27k2p\") pod \"service-ca-operator-777779d784-wzz7z\" (UID: \"ce9452a4-e880-4740-85e9-37ed967f7c75\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wzz7z" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186806 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/47af2971-9c78-4d96-a31f-8b77ea0adca2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ptzrr\" (UID: \"47af2971-9c78-4d96-a31f-8b77ea0adca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186821 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/47af2971-9c78-4d96-a31f-8b77ea0adca2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ptzrr\" (UID: \"47af2971-9c78-4d96-a31f-8b77ea0adca2\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186861 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-bound-sa-token\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186887 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjpr6\" (UniqueName: \"kubernetes.io/projected/bc6d0856-76af-4510-a9b8-1cefb8e82e79-kube-api-access-tjpr6\") pod \"catalog-operator-68c6474976-88z8c\" (UID: \"bc6d0856-76af-4510-a9b8-1cefb8e82e79\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-88z8c" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186906 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/80dcf615-4769-42bf-922e-f3ad191d8a10-images\") pod \"machine-config-operator-74547568cd-cmpgs\" (UID: \"80dcf615-4769-42bf-922e-f3ad191d8a10\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cmpgs" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186923 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed696cd9-4261-4e65-a89d-17f918249fc9-serving-cert\") pod \"controller-manager-879f6c89f-fjq8h\" (UID: \"ed696cd9-4261-4e65-a89d-17f918249fc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186938 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzb5c\" (UniqueName: 
\"kubernetes.io/projected/3e8bfa07-13f6-43d5-aea2-0ed538eefcc7-kube-api-access-wzb5c\") pod \"packageserver-d55dfcdfc-9bpxj\" (UID: \"3e8bfa07-13f6-43d5-aea2-0ed538eefcc7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186955 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60dddbd2-8b96-4ee8-8be3-d745ce95d197-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-g6rg5\" (UID: \"60dddbd2-8b96-4ee8-8be3-d745ce95d197\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g6rg5" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.186986 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm8xb\" (UniqueName: \"kubernetes.io/projected/507f368b-bd66-4e0a-8e30-bc8505f3f76f-kube-api-access-jm8xb\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187009 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqjr4\" (UniqueName: \"kubernetes.io/projected/d8e22655-eb49-4031-926f-866b29597424-kube-api-access-gqjr4\") pod \"package-server-manager-789f6589d5-knp8m\" (UID: \"d8e22655-eb49-4031-926f-866b29597424\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knp8m" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187037 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7fd7c6c0-8841-4e1a-bc44-3d029844e793-signing-key\") pod \"service-ca-9c57cc56f-wzpmb\" (UID: \"7fd7c6c0-8841-4e1a-bc44-3d029844e793\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-wzpmb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187060 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b6068d6-17d2-4802-8778-e1ab076da652-metrics-certs\") pod \"router-default-5444994796-nzj56\" (UID: \"7b6068d6-17d2-4802-8778-e1ab076da652\") " pod="openshift-ingress/router-default-5444994796-nzj56" Mar 09 13:23:42 crc kubenswrapper[4703]: E0309 13:23:42.187082 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:42.687063752 +0000 UTC m=+218.654479428 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187111 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bc6d0856-76af-4510-a9b8-1cefb8e82e79-srv-cert\") pod \"catalog-operator-68c6474976-88z8c\" (UID: \"bc6d0856-76af-4510-a9b8-1cefb8e82e79\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-88z8c" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187139 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a62dccf6-39a0-499e-a596-d6e80d3c0326-kube-api-access\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-4jcpg\" (UID: \"a62dccf6-39a0-499e-a596-d6e80d3c0326\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jcpg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187175 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/80743248-60af-4ab5-990a-be1b945af2ca-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ntntn\" (UID: \"80743248-60af-4ab5-990a-be1b945af2ca\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ntntn" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187217 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9xd4\" (UniqueName: \"kubernetes.io/projected/cd2af265-9ab6-401b-be8f-f8640d043e94-kube-api-access-r9xd4\") pod \"migrator-59844c95c7-gg2ch\" (UID: \"cd2af265-9ab6-401b-be8f-f8640d043e94\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gg2ch" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187278 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0739e37-ae23-4c93-bf96-8c7f6aa72303-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6fmxt\" (UID: \"a0739e37-ae23-4c93-bf96-8c7f6aa72303\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6fmxt" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187300 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bc6d0856-76af-4510-a9b8-1cefb8e82e79-profile-collector-cert\") pod \"catalog-operator-68c6474976-88z8c\" (UID: \"bc6d0856-76af-4510-a9b8-1cefb8e82e79\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-88z8c" Mar 09 13:23:42 crc 
kubenswrapper[4703]: I0309 13:23:42.187317 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl4cx\" (UniqueName: \"kubernetes.io/projected/ae648c70-55be-421c-bcdd-32922dcf947d-kube-api-access-kl4cx\") pod \"machine-config-server-knk5k\" (UID: \"ae648c70-55be-421c-bcdd-32922dcf947d\") " pod="openshift-machine-config-operator/machine-config-server-knk5k" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187347 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/507f368b-bd66-4e0a-8e30-bc8505f3f76f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187385 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/330bced9-c478-4eb6-8b4d-10d69e5a6965-config-volume\") pod \"collect-profiles-29551035-4mjnk\" (UID: \"330bced9-c478-4eb6-8b4d-10d69e5a6965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-4mjnk" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187405 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/866e32df-fa52-499f-9a9f-dfb02663f3b5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-hzcrb\" (UID: \"866e32df-fa52-499f-9a9f-dfb02663f3b5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hzcrb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187423 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ae648c70-55be-421c-bcdd-32922dcf947d-certs\") pod 
\"machine-config-server-knk5k\" (UID: \"ae648c70-55be-421c-bcdd-32922dcf947d\") " pod="openshift-machine-config-operator/machine-config-server-knk5k" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187445 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/271141cc-078d-419f-b9c5-d7ef6263442d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gnllw\" (UID: \"271141cc-078d-419f-b9c5-d7ef6263442d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gnllw" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187462 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3e8bfa07-13f6-43d5-aea2-0ed538eefcc7-webhook-cert\") pod \"packageserver-d55dfcdfc-9bpxj\" (UID: \"3e8bfa07-13f6-43d5-aea2-0ed538eefcc7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187482 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z497k\" (UniqueName: \"kubernetes.io/projected/80dcf615-4769-42bf-922e-f3ad191d8a10-kube-api-access-z497k\") pod \"machine-config-operator-74547568cd-cmpgs\" (UID: \"80dcf615-4769-42bf-922e-f3ad191d8a10\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cmpgs" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187502 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/507f368b-bd66-4e0a-8e30-bc8505f3f76f-etcd-client\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187519 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7dbf6b37-f8d9-4a20-84e0-e9b137a29db6-plugins-dir\") pod \"csi-hostpathplugin-pjlkd\" (UID: \"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6\") " pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187535 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b216ba2d-b6d1-4363-b54b-57f43f575c33-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vsdf8\" (UID: \"b216ba2d-b6d1-4363-b54b-57f43f575c33\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdf8" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187563 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njvgc\" (UniqueName: \"kubernetes.io/projected/80743248-60af-4ab5-990a-be1b945af2ca-kube-api-access-njvgc\") pod \"multus-admission-controller-857f4d67dd-ntntn\" (UID: \"80743248-60af-4ab5-990a-be1b945af2ca\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ntntn" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187578 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a62dccf6-39a0-499e-a596-d6e80d3c0326-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4jcpg\" (UID: \"a62dccf6-39a0-499e-a596-d6e80d3c0326\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jcpg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187595 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng49n\" (UniqueName: \"kubernetes.io/projected/60dddbd2-8b96-4ee8-8be3-d745ce95d197-kube-api-access-ng49n\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-g6rg5\" (UID: \"60dddbd2-8b96-4ee8-8be3-d745ce95d197\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g6rg5" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187609 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hq8x\" (UniqueName: \"kubernetes.io/projected/330bced9-c478-4eb6-8b4d-10d69e5a6965-kube-api-access-2hq8x\") pod \"collect-profiles-29551035-4mjnk\" (UID: \"330bced9-c478-4eb6-8b4d-10d69e5a6965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-4mjnk" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187627 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mldw9\" (UniqueName: \"kubernetes.io/projected/271141cc-078d-419f-b9c5-d7ef6263442d-kube-api-access-mldw9\") pod \"openshift-config-operator-7777fb866f-gnllw\" (UID: \"271141cc-078d-419f-b9c5-d7ef6263442d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gnllw" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187642 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gpmc\" (UniqueName: \"kubernetes.io/projected/b216ba2d-b6d1-4363-b54b-57f43f575c33-kube-api-access-9gpmc\") pod \"ingress-operator-5b745b69d9-vsdf8\" (UID: \"b216ba2d-b6d1-4363-b54b-57f43f575c33\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdf8" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187669 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9fc9149-f3ff-430c-9ea7-b6050cc19222-metrics-tls\") pod \"dns-operator-744455d44c-5clrh\" (UID: \"f9fc9149-f3ff-430c-9ea7-b6050cc19222\") " pod="openshift-dns-operator/dns-operator-744455d44c-5clrh" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 
13:23:42.187687 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3e8bfa07-13f6-43d5-aea2-0ed538eefcc7-apiservice-cert\") pod \"packageserver-d55dfcdfc-9bpxj\" (UID: \"3e8bfa07-13f6-43d5-aea2-0ed538eefcc7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187719 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/507f368b-bd66-4e0a-8e30-bc8505f3f76f-audit-dir\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187735 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a62dccf6-39a0-499e-a596-d6e80d3c0326-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4jcpg\" (UID: \"a62dccf6-39a0-499e-a596-d6e80d3c0326\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jcpg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187764 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/271141cc-078d-419f-b9c5-d7ef6263442d-serving-cert\") pod \"openshift-config-operator-7777fb866f-gnllw\" (UID: \"271141cc-078d-419f-b9c5-d7ef6263442d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gnllw" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187781 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/506de425-cf01-4f0a-b1d2-6e987aaf1580-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dvmv6\" (UID: 
\"506de425-cf01-4f0a-b1d2-6e987aaf1580\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvmv6" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187811 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3e8bfa07-13f6-43d5-aea2-0ed538eefcc7-tmpfs\") pod \"packageserver-d55dfcdfc-9bpxj\" (UID: \"3e8bfa07-13f6-43d5-aea2-0ed538eefcc7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187825 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/80dcf615-4769-42bf-922e-f3ad191d8a10-proxy-tls\") pod \"machine-config-operator-74547568cd-cmpgs\" (UID: \"80dcf615-4769-42bf-922e-f3ad191d8a10\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cmpgs" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187855 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/80dcf615-4769-42bf-922e-f3ad191d8a10-auth-proxy-config\") pod \"machine-config-operator-74547568cd-cmpgs\" (UID: \"80dcf615-4769-42bf-922e-f3ad191d8a10\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cmpgs" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187876 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187893 4703 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kcjjp\" (UniqueName: \"kubernetes.io/projected/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-kube-api-access-kcjjp\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187909 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/507f368b-bd66-4e0a-8e30-bc8505f3f76f-config\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187925 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9460c02-0c7c-4f14-9950-407330c1f960-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kn4xb\" (UID: \"e9460c02-0c7c-4f14-9950-407330c1f960\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kn4xb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187943 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/507f368b-bd66-4e0a-8e30-bc8505f3f76f-encryption-config\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187960 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/330bced9-c478-4eb6-8b4d-10d69e5a6965-secret-volume\") pod \"collect-profiles-29551035-4mjnk\" (UID: \"330bced9-c478-4eb6-8b4d-10d69e5a6965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-4mjnk" Mar 09 
13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187977 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhxtl\" (UniqueName: \"kubernetes.io/projected/7fd7c6c0-8841-4e1a-bc44-3d029844e793-kube-api-access-nhxtl\") pod \"service-ca-9c57cc56f-wzpmb\" (UID: \"7fd7c6c0-8841-4e1a-bc44-3d029844e793\") " pod="openshift-service-ca/service-ca-9c57cc56f-wzpmb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.187994 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab747cdc-f7d4-4f5a-98de-f82e2ff9fd29-cert\") pod \"ingress-canary-vkt4s\" (UID: \"ab747cdc-f7d4-4f5a-98de-f82e2ff9fd29\") " pod="openshift-ingress-canary/ingress-canary-vkt4s" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188012 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c90bc0f7-a19a-46ff-8610-5ee7274faa34-encryption-config\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188039 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/507f368b-bd66-4e0a-8e30-bc8505f3f76f-audit\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188054 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/737f3034-5b9b-49b1-9be8-4a74363050c7-metrics-tls\") pod \"dns-default-trj95\" (UID: \"737f3034-5b9b-49b1-9be8-4a74363050c7\") " pod="openshift-dns/dns-default-trj95" Mar 09 13:23:42 crc 
kubenswrapper[4703]: I0309 13:23:42.188071 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9z9b\" (UniqueName: \"kubernetes.io/projected/737f3034-5b9b-49b1-9be8-4a74363050c7-kube-api-access-f9z9b\") pod \"dns-default-trj95\" (UID: \"737f3034-5b9b-49b1-9be8-4a74363050c7\") " pod="openshift-dns/dns-default-trj95" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188086 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7dbf6b37-f8d9-4a20-84e0-e9b137a29db6-mountpoint-dir\") pod \"csi-hostpathplugin-pjlkd\" (UID: \"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6\") " pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188104 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/507f368b-bd66-4e0a-8e30-bc8505f3f76f-node-pullsecrets\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188121 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c90bc0f7-a19a-46ff-8610-5ee7274faa34-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188137 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl7hq\" (UniqueName: \"kubernetes.io/projected/9b14d2ef-2e68-4bb5-be73-2bbbb837c463-kube-api-access-dl7hq\") pod \"control-plane-machine-set-operator-78cbb6b69f-kzdxt\" (UID: \"9b14d2ef-2e68-4bb5-be73-2bbbb837c463\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kzdxt" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188152 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ae648c70-55be-421c-bcdd-32922dcf947d-node-bootstrap-token\") pod \"machine-config-server-knk5k\" (UID: \"ae648c70-55be-421c-bcdd-32922dcf947d\") " pod="openshift-machine-config-operator/machine-config-server-knk5k" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188170 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c90bc0f7-a19a-46ff-8610-5ee7274faa34-audit-policies\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188186 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9m6f\" (UniqueName: \"kubernetes.io/projected/ed696cd9-4261-4e65-a89d-17f918249fc9-kube-api-access-m9m6f\") pod \"controller-manager-879f6c89f-fjq8h\" (UID: \"ed696cd9-4261-4e65-a89d-17f918249fc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188212 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-registry-certificates\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188229 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/c90bc0f7-a19a-46ff-8610-5ee7274faa34-audit-dir\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188246 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7fd7c6c0-8841-4e1a-bc44-3d029844e793-signing-cabundle\") pod \"service-ca-9c57cc56f-wzpmb\" (UID: \"7fd7c6c0-8841-4e1a-bc44-3d029844e793\") " pod="openshift-service-ca/service-ca-9c57cc56f-wzpmb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188288 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b216ba2d-b6d1-4363-b54b-57f43f575c33-trusted-ca\") pod \"ingress-operator-5b745b69d9-vsdf8\" (UID: \"b216ba2d-b6d1-4363-b54b-57f43f575c33\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdf8" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188318 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn84v\" (UniqueName: \"kubernetes.io/projected/7b6068d6-17d2-4802-8778-e1ab076da652-kube-api-access-vn84v\") pod \"router-default-5444994796-nzj56\" (UID: \"7b6068d6-17d2-4802-8778-e1ab076da652\") " pod="openshift-ingress/router-default-5444994796-nzj56" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188429 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/506de425-cf01-4f0a-b1d2-6e987aaf1580-srv-cert\") pod \"olm-operator-6b444d44fb-dvmv6\" (UID: \"506de425-cf01-4f0a-b1d2-6e987aaf1580\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvmv6" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188447 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65vls\" (UniqueName: \"kubernetes.io/projected/506de425-cf01-4f0a-b1d2-6e987aaf1580-kube-api-access-65vls\") pod \"olm-operator-6b444d44fb-dvmv6\" (UID: \"506de425-cf01-4f0a-b1d2-6e987aaf1580\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvmv6" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188489 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/866e32df-fa52-499f-9a9f-dfb02663f3b5-config\") pod \"kube-controller-manager-operator-78b949d7b-hzcrb\" (UID: \"866e32df-fa52-499f-9a9f-dfb02663f3b5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hzcrb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188526 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c90bc0f7-a19a-46ff-8610-5ee7274faa34-etcd-client\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188731 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b14d2ef-2e68-4bb5-be73-2bbbb837c463-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kzdxt\" (UID: \"9b14d2ef-2e68-4bb5-be73-2bbbb837c463\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kzdxt" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188759 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z9k7\" (UniqueName: \"kubernetes.io/projected/e9460c02-0c7c-4f14-9950-407330c1f960-kube-api-access-2z9k7\") pod 
\"cluster-image-registry-operator-dc59b4c8b-kn4xb\" (UID: \"e9460c02-0c7c-4f14-9950-407330c1f960\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kn4xb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.188867 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c90bc0f7-a19a-46ff-8610-5ee7274faa34-serving-cert\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.189481 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed696cd9-4261-4e65-a89d-17f918249fc9-client-ca\") pod \"controller-manager-879f6c89f-fjq8h\" (UID: \"ed696cd9-4261-4e65-a89d-17f918249fc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.190103 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7b6068d6-17d2-4802-8778-e1ab076da652-metrics-certs\") pod \"router-default-5444994796-nzj56\" (UID: \"7b6068d6-17d2-4802-8778-e1ab076da652\") " pod="openshift-ingress/router-default-5444994796-nzj56" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.190684 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/507f368b-bd66-4e0a-8e30-bc8505f3f76f-etcd-serving-ca\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.191358 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9460c02-0c7c-4f14-9950-407330c1f960-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-kn4xb\" (UID: \"e9460c02-0c7c-4f14-9950-407330c1f960\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kn4xb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.191570 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.191593 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/507f368b-bd66-4e0a-8e30-bc8505f3f76f-serving-cert\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.191702 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-trusted-ca\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.191749 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed696cd9-4261-4e65-a89d-17f918249fc9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fjq8h\" (UID: \"ed696cd9-4261-4e65-a89d-17f918249fc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.192469 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7b6068d6-17d2-4802-8778-e1ab076da652-service-ca-bundle\") pod \"router-default-5444994796-nzj56\" (UID: \"7b6068d6-17d2-4802-8778-e1ab076da652\") " pod="openshift-ingress/router-default-5444994796-nzj56" Mar 09 13:23:42 crc kubenswrapper[4703]: E0309 13:23:42.192492 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:42.692476533 +0000 UTC m=+218.659892309 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.192554 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c90bc0f7-a19a-46ff-8610-5ee7274faa34-etcd-client\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.192691 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed696cd9-4261-4e65-a89d-17f918249fc9-serving-cert\") pod \"controller-manager-879f6c89f-fjq8h\" (UID: \"ed696cd9-4261-4e65-a89d-17f918249fc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.192712 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/507f368b-bd66-4e0a-8e30-bc8505f3f76f-config\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.192799 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/507f368b-bd66-4e0a-8e30-bc8505f3f76f-audit-dir\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.192769 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/507f368b-bd66-4e0a-8e30-bc8505f3f76f-encryption-config\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.192878 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c90bc0f7-a19a-46ff-8610-5ee7274faa34-audit-dir\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.192927 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/507f368b-bd66-4e0a-8e30-bc8505f3f76f-image-import-ca\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.193254 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/507f368b-bd66-4e0a-8e30-bc8505f3f76f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.194644 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/271141cc-078d-419f-b9c5-d7ef6263442d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gnllw\" (UID: \"271141cc-078d-419f-b9c5-d7ef6263442d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gnllw" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.195360 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-registry-tls\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.195726 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed696cd9-4261-4e65-a89d-17f918249fc9-config\") pod \"controller-manager-879f6c89f-fjq8h\" (UID: \"ed696cd9-4261-4e65-a89d-17f918249fc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.196058 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/507f368b-bd66-4e0a-8e30-bc8505f3f76f-node-pullsecrets\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.197108 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" 
(UniqueName: \"kubernetes.io/configmap/507f368b-bd66-4e0a-8e30-bc8505f3f76f-audit\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.197375 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7b6068d6-17d2-4802-8778-e1ab076da652-default-certificate\") pod \"router-default-5444994796-nzj56\" (UID: \"7b6068d6-17d2-4802-8778-e1ab076da652\") " pod="openshift-ingress/router-default-5444994796-nzj56" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.197435 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-registry-certificates\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.197440 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bc6d0856-76af-4510-a9b8-1cefb8e82e79-profile-collector-cert\") pod \"catalog-operator-68c6474976-88z8c\" (UID: \"bc6d0856-76af-4510-a9b8-1cefb8e82e79\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-88z8c" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.197595 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c90bc0f7-a19a-46ff-8610-5ee7274faa34-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.197982 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b216ba2d-b6d1-4363-b54b-57f43f575c33-trusted-ca\") pod \"ingress-operator-5b745b69d9-vsdf8\" (UID: \"b216ba2d-b6d1-4363-b54b-57f43f575c33\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdf8" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.198025 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c90bc0f7-a19a-46ff-8610-5ee7274faa34-audit-policies\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.198666 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c90bc0f7-a19a-46ff-8610-5ee7274faa34-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.199671 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/271141cc-078d-419f-b9c5-d7ef6263442d-serving-cert\") pod \"openshift-config-operator-7777fb866f-gnllw\" (UID: \"271141cc-078d-419f-b9c5-d7ef6263442d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gnllw" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.200270 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7b6068d6-17d2-4802-8778-e1ab076da652-stats-auth\") pod \"router-default-5444994796-nzj56\" (UID: \"7b6068d6-17d2-4802-8778-e1ab076da652\") " pod="openshift-ingress/router-default-5444994796-nzj56" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.202919 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.203365 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9fc9149-f3ff-430c-9ea7-b6050cc19222-metrics-tls\") pod \"dns-operator-744455d44c-5clrh\" (UID: \"f9fc9149-f3ff-430c-9ea7-b6050cc19222\") " pod="openshift-dns-operator/dns-operator-744455d44c-5clrh" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.203579 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c90bc0f7-a19a-46ff-8610-5ee7274faa34-encryption-config\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.203652 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9460c02-0c7c-4f14-9950-407330c1f960-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kn4xb\" (UID: \"e9460c02-0c7c-4f14-9950-407330c1f960\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kn4xb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.203772 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b216ba2d-b6d1-4363-b54b-57f43f575c33-metrics-tls\") pod \"ingress-operator-5b745b69d9-vsdf8\" (UID: \"b216ba2d-b6d1-4363-b54b-57f43f575c33\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdf8" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.203861 4703 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.203876 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bc6d0856-76af-4510-a9b8-1cefb8e82e79-srv-cert\") pod \"catalog-operator-68c6474976-88z8c\" (UID: \"bc6d0856-76af-4510-a9b8-1cefb8e82e79\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-88z8c" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.203862 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/507f368b-bd66-4e0a-8e30-bc8505f3f76f-etcd-client\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.212524 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.221639 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.241736 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.244707 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/955963d3-5f3c-46c6-bfa0-6473a1238064-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vh56s\" (UID: \"955963d3-5f3c-46c6-bfa0-6473a1238064\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vh56s" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.263347 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.265562 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5acad2e3-ce55-43d7-b83f-75adb6b59e71-service-ca-bundle\") pod \"authentication-operator-69f744f599-tnn9w\" (UID: \"5acad2e3-ce55-43d7-b83f-75adb6b59e71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.274319 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-jbdq8"] Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.283263 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.288179 4703 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-29wlf\" (UniqueName: \"kubernetes.io/projected/5acad2e3-ce55-43d7-b83f-75adb6b59e71-kube-api-access-29wlf\") pod \"authentication-operator-69f744f599-tnn9w\" (UID: \"5acad2e3-ce55-43d7-b83f-75adb6b59e71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.290405 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.290737 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/866e32df-fa52-499f-9a9f-dfb02663f3b5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-hzcrb\" (UID: \"866e32df-fa52-499f-9a9f-dfb02663f3b5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hzcrb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.290815 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60dddbd2-8b96-4ee8-8be3-d745ce95d197-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-g6rg5\" (UID: \"60dddbd2-8b96-4ee8-8be3-d745ce95d197\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g6rg5" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.290879 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djqtx\" (UniqueName: \"kubernetes.io/projected/ab747cdc-f7d4-4f5a-98de-f82e2ff9fd29-kube-api-access-djqtx\") pod \"ingress-canary-vkt4s\" (UID: 
\"ab747cdc-f7d4-4f5a-98de-f82e2ff9fd29\") " pod="openshift-ingress-canary/ingress-canary-vkt4s" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.290909 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.290957 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlmdj\" (UniqueName: \"kubernetes.io/projected/47af2971-9c78-4d96-a31f-8b77ea0adca2-kube-api-access-zlmdj\") pod \"marketplace-operator-79b997595-ptzrr\" (UID: \"47af2971-9c78-4d96-a31f-8b77ea0adca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.290981 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0739e37-ae23-4c93-bf96-8c7f6aa72303-config\") pod \"kube-apiserver-operator-766d6c64bb-6fmxt\" (UID: \"a0739e37-ae23-4c93-bf96-8c7f6aa72303\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6fmxt" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291189 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0739e37-ae23-4c93-bf96-8c7f6aa72303-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6fmxt\" (UID: \"a0739e37-ae23-4c93-bf96-8c7f6aa72303\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6fmxt" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291224 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/737f3034-5b9b-49b1-9be8-4a74363050c7-config-volume\") pod \"dns-default-trj95\" (UID: \"737f3034-5b9b-49b1-9be8-4a74363050c7\") " pod="openshift-dns/dns-default-trj95" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291262 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7dbf6b37-f8d9-4a20-84e0-e9b137a29db6-socket-dir\") pod \"csi-hostpathplugin-pjlkd\" (UID: \"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6\") " pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291337 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291365 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291401 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce9452a4-e880-4740-85e9-37ed967f7c75-serving-cert\") pod \"service-ca-operator-777779d784-wzz7z\" (UID: \"ce9452a4-e880-4740-85e9-37ed967f7c75\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wzz7z" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291438 4703 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8e22655-eb49-4031-926f-866b29597424-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-knp8m\" (UID: \"d8e22655-eb49-4031-926f-866b29597424\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knp8m" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291464 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce9452a4-e880-4740-85e9-37ed967f7c75-config\") pod \"service-ca-operator-777779d784-wzz7z\" (UID: \"ce9452a4-e880-4740-85e9-37ed967f7c75\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wzz7z" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291485 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqqv7\" (UniqueName: \"kubernetes.io/projected/7dbf6b37-f8d9-4a20-84e0-e9b137a29db6-kube-api-access-jqqv7\") pod \"csi-hostpathplugin-pjlkd\" (UID: \"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6\") " pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291507 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2mmt\" (UniqueName: \"kubernetes.io/projected/64728a68-4675-4652-a800-7f055197862b-kube-api-access-h2mmt\") pod \"auto-csr-approver-29551042-fv4ms\" (UID: \"64728a68-4675-4652-a800-7f055197862b\") " pod="openshift-infra/auto-csr-approver-29551042-fv4ms" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291533 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/47af2971-9c78-4d96-a31f-8b77ea0adca2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ptzrr\" (UID: \"47af2971-9c78-4d96-a31f-8b77ea0adca2\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291556 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27k2p\" (UniqueName: \"kubernetes.io/projected/ce9452a4-e880-4740-85e9-37ed967f7c75-kube-api-access-27k2p\") pod \"service-ca-operator-777779d784-wzz7z\" (UID: \"ce9452a4-e880-4740-85e9-37ed967f7c75\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wzz7z" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291594 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/80dcf615-4769-42bf-922e-f3ad191d8a10-images\") pod \"machine-config-operator-74547568cd-cmpgs\" (UID: \"80dcf615-4769-42bf-922e-f3ad191d8a10\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cmpgs" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291618 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/47af2971-9c78-4d96-a31f-8b77ea0adca2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ptzrr\" (UID: \"47af2971-9c78-4d96-a31f-8b77ea0adca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291641 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzb5c\" (UniqueName: \"kubernetes.io/projected/3e8bfa07-13f6-43d5-aea2-0ed538eefcc7-kube-api-access-wzb5c\") pod \"packageserver-d55dfcdfc-9bpxj\" (UID: \"3e8bfa07-13f6-43d5-aea2-0ed538eefcc7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291664 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/60dddbd2-8b96-4ee8-8be3-d745ce95d197-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-g6rg5\" (UID: \"60dddbd2-8b96-4ee8-8be3-d745ce95d197\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g6rg5" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291693 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqjr4\" (UniqueName: \"kubernetes.io/projected/d8e22655-eb49-4031-926f-866b29597424-kube-api-access-gqjr4\") pod \"package-server-manager-789f6589d5-knp8m\" (UID: \"d8e22655-eb49-4031-926f-866b29597424\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knp8m" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291718 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7fd7c6c0-8841-4e1a-bc44-3d029844e793-signing-key\") pod \"service-ca-9c57cc56f-wzpmb\" (UID: \"7fd7c6c0-8841-4e1a-bc44-3d029844e793\") " pod="openshift-service-ca/service-ca-9c57cc56f-wzpmb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291742 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a62dccf6-39a0-499e-a596-d6e80d3c0326-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4jcpg\" (UID: \"a62dccf6-39a0-499e-a596-d6e80d3c0326\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jcpg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291764 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/80743248-60af-4ab5-990a-be1b945af2ca-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ntntn\" (UID: \"80743248-60af-4ab5-990a-be1b945af2ca\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-ntntn" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291787 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9xd4\" (UniqueName: \"kubernetes.io/projected/cd2af265-9ab6-401b-be8f-f8640d043e94-kube-api-access-r9xd4\") pod \"migrator-59844c95c7-gg2ch\" (UID: \"cd2af265-9ab6-401b-be8f-f8640d043e94\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gg2ch" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291811 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0739e37-ae23-4c93-bf96-8c7f6aa72303-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6fmxt\" (UID: \"a0739e37-ae23-4c93-bf96-8c7f6aa72303\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6fmxt" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291834 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl4cx\" (UniqueName: \"kubernetes.io/projected/ae648c70-55be-421c-bcdd-32922dcf947d-kube-api-access-kl4cx\") pod \"machine-config-server-knk5k\" (UID: \"ae648c70-55be-421c-bcdd-32922dcf947d\") " pod="openshift-machine-config-operator/machine-config-server-knk5k" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291898 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/866e32df-fa52-499f-9a9f-dfb02663f3b5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-hzcrb\" (UID: \"866e32df-fa52-499f-9a9f-dfb02663f3b5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hzcrb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291923 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/ae648c70-55be-421c-bcdd-32922dcf947d-certs\") pod \"machine-config-server-knk5k\" (UID: \"ae648c70-55be-421c-bcdd-32922dcf947d\") " pod="openshift-machine-config-operator/machine-config-server-knk5k" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.291944 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/330bced9-c478-4eb6-8b4d-10d69e5a6965-config-volume\") pod \"collect-profiles-29551035-4mjnk\" (UID: \"330bced9-c478-4eb6-8b4d-10d69e5a6965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-4mjnk" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292264 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3e8bfa07-13f6-43d5-aea2-0ed538eefcc7-webhook-cert\") pod \"packageserver-d55dfcdfc-9bpxj\" (UID: \"3e8bfa07-13f6-43d5-aea2-0ed538eefcc7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292297 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z497k\" (UniqueName: \"kubernetes.io/projected/80dcf615-4769-42bf-922e-f3ad191d8a10-kube-api-access-z497k\") pod \"machine-config-operator-74547568cd-cmpgs\" (UID: \"80dcf615-4769-42bf-922e-f3ad191d8a10\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cmpgs" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292330 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7dbf6b37-f8d9-4a20-84e0-e9b137a29db6-plugins-dir\") pod \"csi-hostpathplugin-pjlkd\" (UID: \"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6\") " pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292365 4703 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-njvgc\" (UniqueName: \"kubernetes.io/projected/80743248-60af-4ab5-990a-be1b945af2ca-kube-api-access-njvgc\") pod \"multus-admission-controller-857f4d67dd-ntntn\" (UID: \"80743248-60af-4ab5-990a-be1b945af2ca\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ntntn" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292387 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a62dccf6-39a0-499e-a596-d6e80d3c0326-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4jcpg\" (UID: \"a62dccf6-39a0-499e-a596-d6e80d3c0326\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jcpg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292413 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng49n\" (UniqueName: \"kubernetes.io/projected/60dddbd2-8b96-4ee8-8be3-d745ce95d197-kube-api-access-ng49n\") pod \"kube-storage-version-migrator-operator-b67b599dd-g6rg5\" (UID: \"60dddbd2-8b96-4ee8-8be3-d745ce95d197\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g6rg5" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292458 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hq8x\" (UniqueName: \"kubernetes.io/projected/330bced9-c478-4eb6-8b4d-10d69e5a6965-kube-api-access-2hq8x\") pod \"collect-profiles-29551035-4mjnk\" (UID: \"330bced9-c478-4eb6-8b4d-10d69e5a6965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-4mjnk" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292494 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3e8bfa07-13f6-43d5-aea2-0ed538eefcc7-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-9bpxj\" (UID: \"3e8bfa07-13f6-43d5-aea2-0ed538eefcc7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292515 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a62dccf6-39a0-499e-a596-d6e80d3c0326-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4jcpg\" (UID: \"a62dccf6-39a0-499e-a596-d6e80d3c0326\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jcpg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292542 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/506de425-cf01-4f0a-b1d2-6e987aaf1580-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dvmv6\" (UID: \"506de425-cf01-4f0a-b1d2-6e987aaf1580\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvmv6" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292568 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292603 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3e8bfa07-13f6-43d5-aea2-0ed538eefcc7-tmpfs\") pod \"packageserver-d55dfcdfc-9bpxj\" (UID: \"3e8bfa07-13f6-43d5-aea2-0ed538eefcc7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292625 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/80dcf615-4769-42bf-922e-f3ad191d8a10-proxy-tls\") pod \"machine-config-operator-74547568cd-cmpgs\" (UID: \"80dcf615-4769-42bf-922e-f3ad191d8a10\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cmpgs" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292647 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/80dcf615-4769-42bf-922e-f3ad191d8a10-auth-proxy-config\") pod \"machine-config-operator-74547568cd-cmpgs\" (UID: \"80dcf615-4769-42bf-922e-f3ad191d8a10\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cmpgs" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292680 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhxtl\" (UniqueName: \"kubernetes.io/projected/7fd7c6c0-8841-4e1a-bc44-3d029844e793-kube-api-access-nhxtl\") pod \"service-ca-9c57cc56f-wzpmb\" (UID: \"7fd7c6c0-8841-4e1a-bc44-3d029844e793\") " pod="openshift-service-ca/service-ca-9c57cc56f-wzpmb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292700 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab747cdc-f7d4-4f5a-98de-f82e2ff9fd29-cert\") pod \"ingress-canary-vkt4s\" (UID: \"ab747cdc-f7d4-4f5a-98de-f82e2ff9fd29\") " pod="openshift-ingress-canary/ingress-canary-vkt4s" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292720 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/330bced9-c478-4eb6-8b4d-10d69e5a6965-secret-volume\") pod \"collect-profiles-29551035-4mjnk\" (UID: \"330bced9-c478-4eb6-8b4d-10d69e5a6965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-4mjnk" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292743 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/737f3034-5b9b-49b1-9be8-4a74363050c7-metrics-tls\") pod \"dns-default-trj95\" (UID: \"737f3034-5b9b-49b1-9be8-4a74363050c7\") " pod="openshift-dns/dns-default-trj95" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292764 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9z9b\" (UniqueName: \"kubernetes.io/projected/737f3034-5b9b-49b1-9be8-4a74363050c7-kube-api-access-f9z9b\") pod \"dns-default-trj95\" (UID: \"737f3034-5b9b-49b1-9be8-4a74363050c7\") " pod="openshift-dns/dns-default-trj95" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292791 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl7hq\" (UniqueName: \"kubernetes.io/projected/9b14d2ef-2e68-4bb5-be73-2bbbb837c463-kube-api-access-dl7hq\") pod \"control-plane-machine-set-operator-78cbb6b69f-kzdxt\" (UID: \"9b14d2ef-2e68-4bb5-be73-2bbbb837c463\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kzdxt" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292812 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7dbf6b37-f8d9-4a20-84e0-e9b137a29db6-mountpoint-dir\") pod \"csi-hostpathplugin-pjlkd\" (UID: \"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6\") " pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292837 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ae648c70-55be-421c-bcdd-32922dcf947d-node-bootstrap-token\") pod \"machine-config-server-knk5k\" (UID: \"ae648c70-55be-421c-bcdd-32922dcf947d\") " pod="openshift-machine-config-operator/machine-config-server-knk5k" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 
13:23:42.292930 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7fd7c6c0-8841-4e1a-bc44-3d029844e793-signing-cabundle\") pod \"service-ca-9c57cc56f-wzpmb\" (UID: \"7fd7c6c0-8841-4e1a-bc44-3d029844e793\") " pod="openshift-service-ca/service-ca-9c57cc56f-wzpmb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.292977 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/506de425-cf01-4f0a-b1d2-6e987aaf1580-srv-cert\") pod \"olm-operator-6b444d44fb-dvmv6\" (UID: \"506de425-cf01-4f0a-b1d2-6e987aaf1580\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvmv6" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.293001 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65vls\" (UniqueName: \"kubernetes.io/projected/506de425-cf01-4f0a-b1d2-6e987aaf1580-kube-api-access-65vls\") pod \"olm-operator-6b444d44fb-dvmv6\" (UID: \"506de425-cf01-4f0a-b1d2-6e987aaf1580\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvmv6" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.293028 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b14d2ef-2e68-4bb5-be73-2bbbb837c463-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kzdxt\" (UID: \"9b14d2ef-2e68-4bb5-be73-2bbbb837c463\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kzdxt" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.293053 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/866e32df-fa52-499f-9a9f-dfb02663f3b5-config\") pod \"kube-controller-manager-operator-78b949d7b-hzcrb\" (UID: 
\"866e32df-fa52-499f-9a9f-dfb02663f3b5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hzcrb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.293088 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7dbf6b37-f8d9-4a20-84e0-e9b137a29db6-csi-data-dir\") pod \"csi-hostpathplugin-pjlkd\" (UID: \"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6\") " pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.293135 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7dbf6b37-f8d9-4a20-84e0-e9b137a29db6-registration-dir\") pod \"csi-hostpathplugin-pjlkd\" (UID: \"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6\") " pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.293546 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7dbf6b37-f8d9-4a20-84e0-e9b137a29db6-registration-dir\") pod \"csi-hostpathplugin-pjlkd\" (UID: \"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6\") " pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" Mar 09 13:23:42 crc kubenswrapper[4703]: E0309 13:23:42.293627 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:42.793601024 +0000 UTC m=+218.761016740 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.295062 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.295574 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0739e37-ae23-4c93-bf96-8c7f6aa72303-config\") pod \"kube-apiserver-operator-766d6c64bb-6fmxt\" (UID: \"a0739e37-ae23-4c93-bf96-8c7f6aa72303\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6fmxt" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.296738 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7dbf6b37-f8d9-4a20-84e0-e9b137a29db6-socket-dir\") pod \"csi-hostpathplugin-pjlkd\" (UID: \"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6\") " pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.297167 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.297489 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce9452a4-e880-4740-85e9-37ed967f7c75-config\") pod \"service-ca-operator-777779d784-wzz7z\" (UID: \"ce9452a4-e880-4740-85e9-37ed967f7c75\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wzz7z" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.298120 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/80dcf615-4769-42bf-922e-f3ad191d8a10-images\") pod \"machine-config-operator-74547568cd-cmpgs\" (UID: \"80dcf615-4769-42bf-922e-f3ad191d8a10\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cmpgs" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.298343 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3e8bfa07-13f6-43d5-aea2-0ed538eefcc7-tmpfs\") pod \"packageserver-d55dfcdfc-9bpxj\" (UID: \"3e8bfa07-13f6-43d5-aea2-0ed538eefcc7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.299560 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a62dccf6-39a0-499e-a596-d6e80d3c0326-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4jcpg\" (UID: \"a62dccf6-39a0-499e-a596-d6e80d3c0326\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jcpg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.299820 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/60dddbd2-8b96-4ee8-8be3-d745ce95d197-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-g6rg5\" (UID: \"60dddbd2-8b96-4ee8-8be3-d745ce95d197\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g6rg5" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.299863 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/737f3034-5b9b-49b1-9be8-4a74363050c7-config-volume\") pod \"dns-default-trj95\" (UID: \"737f3034-5b9b-49b1-9be8-4a74363050c7\") " pod="openshift-dns/dns-default-trj95" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.300152 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/47af2971-9c78-4d96-a31f-8b77ea0adca2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ptzrr\" (UID: \"47af2971-9c78-4d96-a31f-8b77ea0adca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.300246 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/80dcf615-4769-42bf-922e-f3ad191d8a10-auth-proxy-config\") pod \"machine-config-operator-74547568cd-cmpgs\" (UID: \"80dcf615-4769-42bf-922e-f3ad191d8a10\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cmpgs" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.300296 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.302298 
4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7dbf6b37-f8d9-4a20-84e0-e9b137a29db6-plugins-dir\") pod \"csi-hostpathplugin-pjlkd\" (UID: \"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6\") " pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.302522 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a62dccf6-39a0-499e-a596-d6e80d3c0326-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4jcpg\" (UID: \"a62dccf6-39a0-499e-a596-d6e80d3c0326\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jcpg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.304612 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.304907 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60dddbd2-8b96-4ee8-8be3-d745ce95d197-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-g6rg5\" (UID: \"60dddbd2-8b96-4ee8-8be3-d745ce95d197\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g6rg5" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.304911 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7fd7c6c0-8841-4e1a-bc44-3d029844e793-signing-cabundle\") pod \"service-ca-9c57cc56f-wzpmb\" (UID: \"7fd7c6c0-8841-4e1a-bc44-3d029844e793\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-wzpmb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.305175 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7dbf6b37-f8d9-4a20-84e0-e9b137a29db6-mountpoint-dir\") pod \"csi-hostpathplugin-pjlkd\" (UID: \"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6\") " pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.305489 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3e8bfa07-13f6-43d5-aea2-0ed538eefcc7-webhook-cert\") pod \"packageserver-d55dfcdfc-9bpxj\" (UID: \"3e8bfa07-13f6-43d5-aea2-0ed538eefcc7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.306206 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/866e32df-fa52-499f-9a9f-dfb02663f3b5-config\") pod \"kube-controller-manager-operator-78b949d7b-hzcrb\" (UID: \"866e32df-fa52-499f-9a9f-dfb02663f3b5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hzcrb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.306325 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7dbf6b37-f8d9-4a20-84e0-e9b137a29db6-csi-data-dir\") pod \"csi-hostpathplugin-pjlkd\" (UID: \"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6\") " pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.306875 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab747cdc-f7d4-4f5a-98de-f82e2ff9fd29-cert\") pod \"ingress-canary-vkt4s\" (UID: \"ab747cdc-f7d4-4f5a-98de-f82e2ff9fd29\") " 
pod="openshift-ingress-canary/ingress-canary-vkt4s" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.306959 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.307508 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/330bced9-c478-4eb6-8b4d-10d69e5a6965-config-volume\") pod \"collect-profiles-29551035-4mjnk\" (UID: \"330bced9-c478-4eb6-8b4d-10d69e5a6965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-4mjnk" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.307632 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7q8lx"] Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.308199 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7e8298d3-3e49-4df3-9369-2623b11981cf-images\") pod \"machine-api-operator-5694c8668f-6qv87\" (UID: \"7e8298d3-3e49-4df3-9369-2623b11981cf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6qv87" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.308344 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3e8bfa07-13f6-43d5-aea2-0ed538eefcc7-apiservice-cert\") pod \"packageserver-d55dfcdfc-9bpxj\" (UID: \"3e8bfa07-13f6-43d5-aea2-0ed538eefcc7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.309193 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/80743248-60af-4ab5-990a-be1b945af2ca-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ntntn\" (UID: 
\"80743248-60af-4ab5-990a-be1b945af2ca\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ntntn" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.309577 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/737f3034-5b9b-49b1-9be8-4a74363050c7-metrics-tls\") pod \"dns-default-trj95\" (UID: \"737f3034-5b9b-49b1-9be8-4a74363050c7\") " pod="openshift-dns/dns-default-trj95" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.309958 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8e22655-eb49-4031-926f-866b29597424-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-knp8m\" (UID: \"d8e22655-eb49-4031-926f-866b29597424\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knp8m" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.310577 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b14d2ef-2e68-4bb5-be73-2bbbb837c463-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kzdxt\" (UID: \"9b14d2ef-2e68-4bb5-be73-2bbbb837c463\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kzdxt" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.311704 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/866e32df-fa52-499f-9a9f-dfb02663f3b5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-hzcrb\" (UID: \"866e32df-fa52-499f-9a9f-dfb02663f3b5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hzcrb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.315887 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/47af2971-9c78-4d96-a31f-8b77ea0adca2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ptzrr\" (UID: \"47af2971-9c78-4d96-a31f-8b77ea0adca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.315994 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ae648c70-55be-421c-bcdd-32922dcf947d-certs\") pod \"machine-config-server-knk5k\" (UID: \"ae648c70-55be-421c-bcdd-32922dcf947d\") " pod="openshift-machine-config-operator/machine-config-server-knk5k" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.316219 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ae648c70-55be-421c-bcdd-32922dcf947d-node-bootstrap-token\") pod \"machine-config-server-knk5k\" (UID: \"ae648c70-55be-421c-bcdd-32922dcf947d\") " pod="openshift-machine-config-operator/machine-config-server-knk5k" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.316385 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0739e37-ae23-4c93-bf96-8c7f6aa72303-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6fmxt\" (UID: \"a0739e37-ae23-4c93-bf96-8c7f6aa72303\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6fmxt" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.316408 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/506de425-cf01-4f0a-b1d2-6e987aaf1580-srv-cert\") pod \"olm-operator-6b444d44fb-dvmv6\" (UID: \"506de425-cf01-4f0a-b1d2-6e987aaf1580\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvmv6" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.316531 4703 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce9452a4-e880-4740-85e9-37ed967f7c75-serving-cert\") pod \"service-ca-operator-777779d784-wzz7z\" (UID: \"ce9452a4-e880-4740-85e9-37ed967f7c75\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wzz7z" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.316929 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/80dcf615-4769-42bf-922e-f3ad191d8a10-proxy-tls\") pod \"machine-config-operator-74547568cd-cmpgs\" (UID: \"80dcf615-4769-42bf-922e-f3ad191d8a10\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cmpgs" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.316959 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/330bced9-c478-4eb6-8b4d-10d69e5a6965-secret-volume\") pod \"collect-profiles-29551035-4mjnk\" (UID: \"330bced9-c478-4eb6-8b4d-10d69e5a6965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-4mjnk" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.317436 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7fd7c6c0-8841-4e1a-bc44-3d029844e793-signing-key\") pod \"service-ca-9c57cc56f-wzpmb\" (UID: \"7fd7c6c0-8841-4e1a-bc44-3d029844e793\") " pod="openshift-service-ca/service-ca-9c57cc56f-wzpmb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.317756 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/506de425-cf01-4f0a-b1d2-6e987aaf1580-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dvmv6\" (UID: \"506de425-cf01-4f0a-b1d2-6e987aaf1580\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvmv6" Mar 09 13:23:42 crc 
kubenswrapper[4703]: W0309 13:23:42.320235 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod584ca99c_8678_42f1_8a73_704780debc36.slice/crio-a6c678d5557138a8b1db169f4e4fc224957c593d38fcba5086c0ed48403312a8 WatchSource:0}: Error finding container a6c678d5557138a8b1db169f4e4fc224957c593d38fcba5086c0ed48403312a8: Status 404 returned error can't find the container with id a6c678d5557138a8b1db169f4e4fc224957c593d38fcba5086c0ed48403312a8 Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.336743 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vh56s" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.338187 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9qw5t"] Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.348485 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.360739 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z9k7\" (UniqueName: \"kubernetes.io/projected/e9460c02-0c7c-4f14-9950-407330c1f960-kube-api-access-2z9k7\") pod \"cluster-image-registry-operator-dc59b4c8b-kn4xb\" (UID: \"e9460c02-0c7c-4f14-9950-407330c1f960\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kn4xb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.379554 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-6qv87" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.380347 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qd4mj"] Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.388971 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvpdj\" (UniqueName: \"kubernetes.io/projected/c90bc0f7-a19a-46ff-8610-5ee7274faa34-kube-api-access-pvpdj\") pod \"apiserver-7bbb656c7d-gqh49\" (UID: \"c90bc0f7-a19a-46ff-8610-5ee7274faa34\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.389362 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-n7xr4"] Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.394686 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: E0309 13:23:42.395147 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:42.895107047 +0000 UTC m=+218.862522723 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.395452 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wm2c\" (UniqueName: \"kubernetes.io/projected/f9fc9149-f3ff-430c-9ea7-b6050cc19222-kube-api-access-4wm2c\") pod \"dns-operator-744455d44c-5clrh\" (UID: \"f9fc9149-f3ff-430c-9ea7-b6050cc19222\") " pod="openshift-dns-operator/dns-operator-744455d44c-5clrh" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.433864 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjpr6\" (UniqueName: \"kubernetes.io/projected/bc6d0856-76af-4510-a9b8-1cefb8e82e79-kube-api-access-tjpr6\") pod \"catalog-operator-68c6474976-88z8c\" (UID: \"bc6d0856-76af-4510-a9b8-1cefb8e82e79\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-88z8c" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.440608 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cldjt\" (UniqueName: \"kubernetes.io/projected/2b4dabbf-243f-4504-abc1-b34b4da6a25c-kube-api-access-cldjt\") pod \"downloads-7954f5f757-vs8rs\" (UID: \"2b4dabbf-243f-4504-abc1-b34b4da6a25c\") " pod="openshift-console/downloads-7954f5f757-vs8rs" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.441791 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.452913 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ghbk9"] Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.467757 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm8xb\" (UniqueName: \"kubernetes.io/projected/507f368b-bd66-4e0a-8e30-bc8505f3f76f-kube-api-access-jm8xb\") pod \"apiserver-76f77b778f-k8tgg\" (UID: \"507f368b-bd66-4e0a-8e30-bc8505f3f76f\") " pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.473915 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.475409 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xzsw"] Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.477928 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mldw9\" (UniqueName: \"kubernetes.io/projected/271141cc-078d-419f-b9c5-d7ef6263442d-kube-api-access-mldw9\") pod \"openshift-config-operator-7777fb866f-gnllw\" (UID: \"271141cc-078d-419f-b9c5-d7ef6263442d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gnllw" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.491993 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-vs8rs" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.496437 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:42 crc kubenswrapper[4703]: E0309 13:23:42.496924 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:42.996875578 +0000 UTC m=+218.964291264 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.506518 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gpmc\" (UniqueName: \"kubernetes.io/projected/b216ba2d-b6d1-4363-b54b-57f43f575c33-kube-api-access-9gpmc\") pod \"ingress-operator-5b745b69d9-vsdf8\" (UID: \"b216ba2d-b6d1-4363-b54b-57f43f575c33\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdf8" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.517048 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9460c02-0c7c-4f14-9950-407330c1f960-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-kn4xb\" (UID: \"e9460c02-0c7c-4f14-9950-407330c1f960\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kn4xb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.532564 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.537484 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-bound-sa-token\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.549158 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.562899 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9m6f\" (UniqueName: \"kubernetes.io/projected/ed696cd9-4261-4e65-a89d-17f918249fc9-kube-api-access-m9m6f\") pod \"controller-manager-879f6c89f-fjq8h\" (UID: \"ed696cd9-4261-4e65-a89d-17f918249fc9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.577245 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn84v\" (UniqueName: \"kubernetes.io/projected/7b6068d6-17d2-4802-8778-e1ab076da652-kube-api-access-vn84v\") pod \"router-default-5444994796-nzj56\" (UID: \"7b6068d6-17d2-4802-8778-e1ab076da652\") " pod="openshift-ingress/router-default-5444994796-nzj56" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.589201 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kn4xb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.599481 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: E0309 13:23:42.599928 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:43.099912476 +0000 UTC m=+219.067328162 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.602398 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b216ba2d-b6d1-4363-b54b-57f43f575c33-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vsdf8\" (UID: \"b216ba2d-b6d1-4363-b54b-57f43f575c33\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdf8" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.602452 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5clrh" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.603423 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vh56s"] Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.606066 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-nzj56" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.612122 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gnllw" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.625777 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcjjp\" (UniqueName: \"kubernetes.io/projected/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-kube-api-access-kcjjp\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.627554 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdf8" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.637718 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-tnn9w"] Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.638909 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-88z8c" Mar 09 13:23:42 crc kubenswrapper[4703]: W0309 13:23:42.644528 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod955963d3_5f3c_46c6_bfa0_6473a1238064.slice/crio-6c9adf5fe3e1586c97070d0979a3ec0ad9654471b0d413985033eb9042d55274 WatchSource:0}: Error finding container 6c9adf5fe3e1586c97070d0979a3ec0ad9654471b0d413985033eb9042d55274: Status 404 returned error can't find the container with id 6c9adf5fe3e1586c97070d0979a3ec0ad9654471b0d413985033eb9042d55274 Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.648600 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/866e32df-fa52-499f-9a9f-dfb02663f3b5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-hzcrb\" (UID: \"866e32df-fa52-499f-9a9f-dfb02663f3b5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hzcrb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.657642 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlmdj\" (UniqueName: \"kubernetes.io/projected/47af2971-9c78-4d96-a31f-8b77ea0adca2-kube-api-access-zlmdj\") pod \"marketplace-operator-79b997595-ptzrr\" (UID: \"47af2971-9c78-4d96-a31f-8b77ea0adca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.690121 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djqtx\" (UniqueName: \"kubernetes.io/projected/ab747cdc-f7d4-4f5a-98de-f82e2ff9fd29-kube-api-access-djqtx\") pod \"ingress-canary-vkt4s\" (UID: \"ab747cdc-f7d4-4f5a-98de-f82e2ff9fd29\") " pod="openshift-ingress-canary/ingress-canary-vkt4s" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.690953 4703 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6qv87"] Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.698352 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njvgc\" (UniqueName: \"kubernetes.io/projected/80743248-60af-4ab5-990a-be1b945af2ca-kube-api-access-njvgc\") pod \"multus-admission-controller-857f4d67dd-ntntn\" (UID: \"80743248-60af-4ab5-990a-be1b945af2ca\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ntntn" Mar 09 13:23:42 crc kubenswrapper[4703]: E0309 13:23:42.701671 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:43.201646326 +0000 UTC m=+219.169062012 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.700905 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.705508 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: E0309 13:23:42.706077 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:43.206060377 +0000 UTC m=+219.173476143 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.709924 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ntntn" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.712337 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.715486 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hzcrb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.717332 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0739e37-ae23-4c93-bf96-8c7f6aa72303-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6fmxt\" (UID: \"a0739e37-ae23-4c93-bf96-8c7f6aa72303\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6fmxt" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.743832 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hq8x\" (UniqueName: \"kubernetes.io/projected/330bced9-c478-4eb6-8b4d-10d69e5a6965-kube-api-access-2hq8x\") pod \"collect-profiles-29551035-4mjnk\" (UID: \"330bced9-c478-4eb6-8b4d-10d69e5a6965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-4mjnk" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.744647 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-4mjnk" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.775565 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27k2p\" (UniqueName: \"kubernetes.io/projected/ce9452a4-e880-4740-85e9-37ed967f7c75-kube-api-access-27k2p\") pod \"service-ca-operator-777779d784-wzz7z\" (UID: \"ce9452a4-e880-4740-85e9-37ed967f7c75\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wzz7z" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.776344 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" Mar 09 13:23:42 crc kubenswrapper[4703]: W0309 13:23:42.777912 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e8298d3_3e49_4df3_9369_2623b11981cf.slice/crio-6ceee0dacd960785b26a307e11683cff4c58c16eedbc4d99635fd0423b79585e WatchSource:0}: Error finding container 6ceee0dacd960785b26a307e11683cff4c58c16eedbc4d99635fd0423b79585e: Status 404 returned error can't find the container with id 6ceee0dacd960785b26a307e11683cff4c58c16eedbc4d99635fd0423b79585e Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.782096 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-vs8rs"] Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.782902 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhxtl\" (UniqueName: \"kubernetes.io/projected/7fd7c6c0-8841-4e1a-bc44-3d029844e793-kube-api-access-nhxtl\") pod \"service-ca-9c57cc56f-wzpmb\" (UID: \"7fd7c6c0-8841-4e1a-bc44-3d029844e793\") " pod="openshift-service-ca/service-ca-9c57cc56f-wzpmb" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.800371 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqqv7\" (UniqueName: \"kubernetes.io/projected/7dbf6b37-f8d9-4a20-84e0-e9b137a29db6-kube-api-access-jqqv7\") pod \"csi-hostpathplugin-pjlkd\" (UID: \"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6\") " pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.802670 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.807468 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.809489 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" Mar 09 13:23:42 crc kubenswrapper[4703]: E0309 13:23:42.812100 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:43.312078564 +0000 UTC m=+219.279494250 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.812159 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:42 crc kubenswrapper[4703]: E0309 13:23:42.812495 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:43.312482986 +0000 UTC m=+219.279898672 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.828618 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vkt4s" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.829726 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzb5c\" (UniqueName: \"kubernetes.io/projected/3e8bfa07-13f6-43d5-aea2-0ed538eefcc7-kube-api-access-wzb5c\") pod \"packageserver-d55dfcdfc-9bpxj\" (UID: \"3e8bfa07-13f6-43d5-aea2-0ed538eefcc7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.857683 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqjr4\" (UniqueName: \"kubernetes.io/projected/d8e22655-eb49-4031-926f-866b29597424-kube-api-access-gqjr4\") pod \"package-server-manager-789f6589d5-knp8m\" (UID: \"d8e22655-eb49-4031-926f-866b29597424\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knp8m" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.861970 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng49n\" (UniqueName: \"kubernetes.io/projected/60dddbd2-8b96-4ee8-8be3-d745ce95d197-kube-api-access-ng49n\") pod \"kube-storage-version-migrator-operator-b67b599dd-g6rg5\" (UID: \"60dddbd2-8b96-4ee8-8be3-d745ce95d197\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g6rg5" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.874638 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49"] Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.895001 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2mmt\" (UniqueName: \"kubernetes.io/projected/64728a68-4675-4652-a800-7f055197862b-kube-api-access-h2mmt\") pod \"auto-csr-approver-29551042-fv4ms\" (UID: \"64728a68-4675-4652-a800-7f055197862b\") " 
pod="openshift-infra/auto-csr-approver-29551042-fv4ms" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.903009 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9xd4\" (UniqueName: \"kubernetes.io/projected/cd2af265-9ab6-401b-be8f-f8640d043e94-kube-api-access-r9xd4\") pod \"migrator-59844c95c7-gg2ch\" (UID: \"cd2af265-9ab6-401b-be8f-f8640d043e94\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gg2ch" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.913395 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:42 crc kubenswrapper[4703]: E0309 13:23:42.913779 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:43.413758331 +0000 UTC m=+219.381174017 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.919132 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a62dccf6-39a0-499e-a596-d6e80d3c0326-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4jcpg\" (UID: \"a62dccf6-39a0-499e-a596-d6e80d3c0326\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jcpg" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.953721 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z497k\" (UniqueName: \"kubernetes.io/projected/80dcf615-4769-42bf-922e-f3ad191d8a10-kube-api-access-z497k\") pod \"machine-config-operator-74547568cd-cmpgs\" (UID: \"80dcf615-4769-42bf-922e-f3ad191d8a10\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cmpgs" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.981202 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl7hq\" (UniqueName: \"kubernetes.io/projected/9b14d2ef-2e68-4bb5-be73-2bbbb837c463-kube-api-access-dl7hq\") pod \"control-plane-machine-set-operator-78cbb6b69f-kzdxt\" (UID: \"9b14d2ef-2e68-4bb5-be73-2bbbb837c463\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kzdxt" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.991927 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9z9b\" (UniqueName: 
\"kubernetes.io/projected/737f3034-5b9b-49b1-9be8-4a74363050c7-kube-api-access-f9z9b\") pod \"dns-default-trj95\" (UID: \"737f3034-5b9b-49b1-9be8-4a74363050c7\") " pod="openshift-dns/dns-default-trj95" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.998053 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6fmxt" Mar 09 13:23:42 crc kubenswrapper[4703]: I0309 13:23:42.998618 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kzdxt" Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.000419 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gg2ch" Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.019611 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.019809 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65vls\" (UniqueName: \"kubernetes.io/projected/506de425-cf01-4f0a-b1d2-6e987aaf1580-kube-api-access-65vls\") pod \"olm-operator-6b444d44fb-dvmv6\" (UID: \"506de425-cf01-4f0a-b1d2-6e987aaf1580\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvmv6" Mar 09 13:23:43 crc kubenswrapper[4703]: E0309 13:23:43.020062 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-09 13:23:43.520050616 +0000 UTC m=+219.487466302 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.023376 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g6rg5" Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.025694 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl4cx\" (UniqueName: \"kubernetes.io/projected/ae648c70-55be-421c-bcdd-32922dcf947d-kube-api-access-kl4cx\") pod \"machine-config-server-knk5k\" (UID: \"ae648c70-55be-421c-bcdd-32922dcf947d\") " pod="openshift-machine-config-operator/machine-config-server-knk5k" Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.030369 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jcpg" Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.036579 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cmpgs" Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.053432 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knp8m" Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.059218 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-wzpmb" Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.067129 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wzz7z" Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.076903 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gnllw"] Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.082154 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj" Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.103415 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551042-fv4ms" Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.120818 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:43 crc kubenswrapper[4703]: E0309 13:23:43.121978 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:43.621954121 +0000 UTC m=+219.589369807 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.122105 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9"
Mar 09 13:23:43 crc kubenswrapper[4703]: E0309 13:23:43.122827 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:43.622811166 +0000 UTC m=+219.590226842 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.137269 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-trj95"
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.145337 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-knk5k"
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.214918 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kn4xb"]
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.215611 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5clrh"]
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.217909 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vh56s" event={"ID":"955963d3-5f3c-46c6-bfa0-6473a1238064","Type":"ContainerStarted","Data":"6c9adf5fe3e1586c97070d0979a3ec0ad9654471b0d413985033eb9042d55274"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.250321 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vsdf8"]
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.263201 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:23:43 crc kubenswrapper[4703]: E0309 13:23:43.263682 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:43.763663547 +0000 UTC m=+219.731079233 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.263815 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvmv6"
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.265876 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jbdq8" event={"ID":"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63","Type":"ContainerStarted","Data":"dff44c50a41a4d9d8eb9c02f6fe7ec0658a0da1008644857dd380c621ec43fdf"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.265918 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jbdq8" event={"ID":"98d3f0fa-d5c6-4288-af8a-bdc0b29dab63","Type":"ContainerStarted","Data":"6ce6ef9e54a16a2578a1050b06842ae451e7779efaec25b4da347ccdd315d15a"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.289491 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6qv87" event={"ID":"7e8298d3-3e49-4df3-9369-2623b11981cf","Type":"ContainerStarted","Data":"6ceee0dacd960785b26a307e11683cff4c58c16eedbc4d99635fd0423b79585e"}
Mar 09 13:23:43 crc kubenswrapper[4703]: W0309 13:23:43.289588 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod271141cc_078d_419f_b9c5_d7ef6263442d.slice/crio-df3a0c207f899b335fdb4f6b3a3bb96645402c8c8a55d4635a97af89a7c63446 WatchSource:0}: Error finding container df3a0c207f899b335fdb4f6b3a3bb96645402c8c8a55d4635a97af89a7c63446: Status 404 returned error can't find the container with id df3a0c207f899b335fdb4f6b3a3bb96645402c8c8a55d4635a97af89a7c63446
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.320696 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-nzj56" event={"ID":"7b6068d6-17d2-4802-8778-e1ab076da652","Type":"ContainerStarted","Data":"52c1790590006ca8bcb0dfc7383d17b742b567df56c9b9da8344ebe35128810b"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.329217 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"82e6dfa1304d32f5215fda62bf7f54cb59ea295b60e0b8cc4e80d74182c17be3"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.347423 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7q8lx" event={"ID":"584ca99c-8678-42f1-8a73-704780debc36","Type":"ContainerStarted","Data":"07e8e5e42585e99747b2579ce8f95d8df36c309c5e12f5eb9e7688b7ed5d2296"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.347491 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7q8lx" event={"ID":"584ca99c-8678-42f1-8a73-704780debc36","Type":"ContainerStarted","Data":"22b580a18581440751293a86f746f1f9f61f1ca19e1a4dac6252895c0e2606d8"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.347500 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7q8lx" event={"ID":"584ca99c-8678-42f1-8a73-704780debc36","Type":"ContainerStarted","Data":"a6c678d5557138a8b1db169f4e4fc224957c593d38fcba5086c0ed48403312a8"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.348766 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-k8tgg"]
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.352229 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" event={"ID":"20cb1ef8-0711-4f38-a0aa-3a8a3953951e","Type":"ContainerStarted","Data":"4cf4a9861fb51fa6d8f77320b367d5cfc2f3823ced978a458f66fbd6dadb1ea5"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.352263 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9"
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.360678 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xzsw" event={"ID":"26e4ae04-1b0c-4cdf-a086-356bab16766e","Type":"ContainerStarted","Data":"2b1ab3d4fe9df484007424fe1d65ede9c8de5a563d033acdc43f9702ad0fc271"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.362553 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" event={"ID":"9984b825-d6a3-4756-b2dc-2a240ca82a8d","Type":"ContainerStarted","Data":"9cf9d7c5900a0f7c0a22cf4576ec3c685ec4aae89f6869028f9b829666b1d094"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.362575 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" event={"ID":"9984b825-d6a3-4756-b2dc-2a240ca82a8d","Type":"ContainerStarted","Data":"96638d3c4ab1d63a5aa7e048a51b921464b7b044ac15e4e1055bd18ac47a3d99"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.363636 4703 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-ghbk9 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body=
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.363684 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" podUID="20cb1ef8-0711-4f38-a0aa-3a8a3953951e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused"
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.364220 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9"
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.366618 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-9qw5t" event={"ID":"f9e0f844-c3b0-4411-ba1a-4c57a5e7796e","Type":"ContainerStarted","Data":"fc3e349ceb96ad80d098e225ffb75a886fd8e2c6d7d7a75bcbd0c6fe6d97695b"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.366647 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-9qw5t" event={"ID":"f9e0f844-c3b0-4411-ba1a-4c57a5e7796e","Type":"ContainerStarted","Data":"2d27f8637f8e15db317d96c4de69027ee7cdce6dd422bb24a6132cd5da578932"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.367635 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-9qw5t"
Mar 09 13:23:43 crc kubenswrapper[4703]: E0309 13:23:43.368941 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:43.868919731 +0000 UTC m=+219.836335417 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.369532 4703 patch_prober.go:28] interesting pod/console-operator-58897d9998-9qw5t container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.369564 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-9qw5t" podUID="f9e0f844-c3b0-4411-ba1a-4c57a5e7796e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.370428 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e7034da3ef6589c52dd96ad63994109e922bd6a0ec80fce0f12960a73c7a72de"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.371962 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" event={"ID":"c90bc0f7-a19a-46ff-8610-5ee7274faa34","Type":"ContainerStarted","Data":"362a5273a429c002680c815d51f5d935f65aa9e2d064cbd2b4f7cbad5254ded3"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.405165 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" event={"ID":"04b27022-1294-45a5-90c4-17d007e9b468","Type":"ContainerStarted","Data":"8e14d39b7d6d857cf5f7aec953b73acebc6a9da60c39b571f4170a2f8709fe48"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.406189 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd"
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.408708 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-88z8c"]
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.420789 4703 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-tvbdd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body=
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.420855 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" podUID="04b27022-1294-45a5-90c4-17d007e9b468" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused"
Mar 09 13:23:43 crc kubenswrapper[4703]: W0309 13:23:43.422533 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9460c02_0c7c_4f14_9950_407330c1f960.slice/crio-dce6092dfc6f74b19241648dcd88a789c92bc786738bd440456e24220e2419d7 WatchSource:0}: Error finding container dce6092dfc6f74b19241648dcd88a789c92bc786738bd440456e24220e2419d7: Status 404 returned error can't find the container with id dce6092dfc6f74b19241648dcd88a789c92bc786738bd440456e24220e2419d7
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.427342 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qd4mj" event={"ID":"8748922b-1caa-42f6-a573-9a43c160b26a","Type":"ContainerStarted","Data":"827c9f0b9293de8f4e93f40514d783f50f933bb89458b7e83c1e502f04a2db86"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.427395 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qd4mj" event={"ID":"8748922b-1caa-42f6-a573-9a43c160b26a","Type":"ContainerStarted","Data":"20793a909760c29ab5a31a28b914c297412b5b79a1e17f2777608adbc5a9345b"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.441041 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-vs8rs" event={"ID":"2b4dabbf-243f-4504-abc1-b34b4da6a25c","Type":"ContainerStarted","Data":"cd471eae94c28f182028fa25877bae8ceab4e99d287718ec2b1450290c497009"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.466340 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:23:43 crc kubenswrapper[4703]: E0309 13:23:43.467610 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:43.96759038 +0000 UTC m=+219.935006116 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.474301 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhqcw" event={"ID":"e9d54014-8ba5-4a3a-beb0-3b260f0ef64c","Type":"ContainerStarted","Data":"6396061b19039e5816840c12b672bb03980a6a4c5e56731d640a17124230e05a"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.484466 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" event={"ID":"5acad2e3-ce55-43d7-b83f-75adb6b59e71","Type":"ContainerStarted","Data":"5472914df301ab9fde174596904a6264caae24c39a6fd6aa1016a4d3110419c6"}
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.497208 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0f4c0b7a13b85bcfa2ddf68d43a6257b432a9fdd6cc3d2b41eb4a083ae5d7970"}
Mar 09 13:23:43 crc kubenswrapper[4703]: W0309 13:23:43.508504 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc6d0856_76af_4510_a9b8_1cefb8e82e79.slice/crio-1a114b2ceeabe0c1667cae9431e15d5b740e2bd084e4f5fff3c2f7d1458b3b30 WatchSource:0}: Error finding container 1a114b2ceeabe0c1667cae9431e15d5b740e2bd084e4f5fff3c2f7d1458b3b30: Status 404 returned error can't find the container with id 1a114b2ceeabe0c1667cae9431e15d5b740e2bd084e4f5fff3c2f7d1458b3b30
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.573129 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9"
Mar 09 13:23:43 crc kubenswrapper[4703]: E0309 13:23:43.574606 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:44.074591246 +0000 UTC m=+220.042006932 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.674006 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:23:43 crc kubenswrapper[4703]: E0309 13:23:43.674931 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:44.174590474 +0000 UTC m=+220.142006160 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.677176 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9"
Mar 09 13:23:43 crc kubenswrapper[4703]: E0309 13:23:43.677603 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:44.177590563 +0000 UTC m=+220.145006249 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.704038 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551035-4mjnk"]
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.730115 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wzz7z"]
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.753391 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hzcrb"]
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.761672 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vkt4s"]
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.778222 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:23:43 crc kubenswrapper[4703]: E0309 13:23:43.778569 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:44.27855511 +0000 UTC m=+220.245970796 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.807805 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fjq8h"]
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.865012 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pjlkd"]
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.869078 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ntntn"]
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.880382 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9"
Mar 09 13:23:43 crc kubenswrapper[4703]: E0309 13:23:43.881010 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:44.380997241 +0000 UTC m=+220.348412927 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:43 crc kubenswrapper[4703]: W0309 13:23:43.963021 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dbf6b37_f8d9_4a20_84e0_e9b137a29db6.slice/crio-3419bc78c8fa9e617b4ac25ec1797d82a883c3d8042124aaae36338dd95b7acf WatchSource:0}: Error finding container 3419bc78c8fa9e617b4ac25ec1797d82a883c3d8042124aaae36338dd95b7acf: Status 404 returned error can't find the container with id 3419bc78c8fa9e617b4ac25ec1797d82a883c3d8042124aaae36338dd95b7acf
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.986749 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:23:43 crc kubenswrapper[4703]: E0309 13:23:43.987010 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:44.486987827 +0000 UTC m=+220.454403513 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:43 crc kubenswrapper[4703]: I0309 13:23:43.987205 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9"
Mar 09 13:23:43 crc kubenswrapper[4703]: E0309 13:23:43.987455 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:44.487448061 +0000 UTC m=+220.454863747 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.080481 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" podStartSLOduration=168.080456221 podStartE2EDuration="2m48.080456221s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:44.076601027 +0000 UTC m=+220.044016713" watchObservedRunningTime="2026-03-09 13:23:44.080456221 +0000 UTC m=+220.047871907"
Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.088739 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:23:44 crc kubenswrapper[4703]: E0309 13:23:44.089116 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:44.589098878 +0000 UTC m=+220.556514614 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.164072 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" podStartSLOduration=168.164044743 podStartE2EDuration="2m48.164044743s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:44.122501239 +0000 UTC m=+220.089916935" watchObservedRunningTime="2026-03-09 13:23:44.164044743 +0000 UTC m=+220.131460429"
Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.192716 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9"
Mar 09 13:23:44 crc kubenswrapper[4703]: E0309 13:23:44.193407 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:44.693393934 +0000 UTC m=+220.660809620 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.242873 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qd4mj" podStartSLOduration=168.242829431 podStartE2EDuration="2m48.242829431s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:44.242754819 +0000 UTC m=+220.210170505" watchObservedRunningTime="2026-03-09 13:23:44.242829431 +0000 UTC m=+220.210245107"
Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.263376 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kzdxt"]
Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.297723 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:23:44 crc kubenswrapper[4703]: E0309 13:23:44.297795 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:44.797768302 +0000 UTC m=+220.765183988 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.299896 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9"
Mar 09 13:23:44 crc kubenswrapper[4703]: E0309 13:23:44.300327 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:44.800315737 +0000 UTC m=+220.767731423 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.401519 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:23:44 crc kubenswrapper[4703]: E0309 13:23:44.402041 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:44.901990995 +0000 UTC m=+220.869406691 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.402312 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9"
Mar 09 13:23:44 crc kubenswrapper[4703]: E0309 13:23:44.402819 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:44.902791439 +0000 UTC m=+220.870207135 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.477264 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" podStartSLOduration=167.477248689 podStartE2EDuration="2m47.477248689s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:44.440024034 +0000 UTC m=+220.407439730" watchObservedRunningTime="2026-03-09 13:23:44.477248689 +0000 UTC m=+220.444664375" Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.507163 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:44 crc kubenswrapper[4703]: E0309 13:23:44.508482 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:45.008437235 +0000 UTC m=+220.975852921 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.508617 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:44 crc kubenswrapper[4703]: E0309 13:23:44.509266 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:45.009235749 +0000 UTC m=+220.976651435 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.514741 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdf8" event={"ID":"b216ba2d-b6d1-4363-b54b-57f43f575c33","Type":"ContainerStarted","Data":"cd01b6d686d9c2512c1ddf151a63549a64f2f8535e46e5eb3b470b84c079ccf2"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.525685 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a883432ae3e958d3b60ee5d25703cd357e5da335bfd1fc1ad56b540c6f446257"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.533809 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6fmxt"] Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.537717 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vh56s" event={"ID":"955963d3-5f3c-46c6-bfa0-6473a1238064","Type":"ContainerStarted","Data":"e740a7d63d82c1135f56af7a589d7b54c4b89d38d3ffc2243e9756023ad85097"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.558905 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj"] Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.560925 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" event={"ID":"20cb1ef8-0711-4f38-a0aa-3a8a3953951e","Type":"ContainerStarted","Data":"60971fa0f9dcb0c59be4eb5624df93eaf80634510d4736072cdeafe4702940f2"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.562072 4703 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-ghbk9 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.562122 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" podUID="20cb1ef8-0711-4f38-a0aa-3a8a3953951e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.568249 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-4mjnk" event={"ID":"330bced9-c478-4eb6-8b4d-10d69e5a6965","Type":"ContainerStarted","Data":"30c24125a30470900945b60116c19d7cb0b565b8f2f3729c7ace7443062c69a1"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.569945 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5clrh" event={"ID":"f9fc9149-f3ff-430c-9ea7-b6050cc19222","Type":"ContainerStarted","Data":"6a5fa954450ee89033f458fd9cbc7f314eac1dbb9e4cb265f3e709646e05ac96"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.572755 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hzcrb" event={"ID":"866e32df-fa52-499f-9a9f-dfb02663f3b5","Type":"ContainerStarted","Data":"c584863702dfcb8d4038d77294914798263be01d573e3e24e072b83654063bbc"} Mar 
09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.578952 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gnllw" event={"ID":"271141cc-078d-419f-b9c5-d7ef6263442d","Type":"ContainerStarted","Data":"df3a0c207f899b335fdb4f6b3a3bb96645402c8c8a55d4635a97af89a7c63446"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.583625 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ptzrr"] Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.601272 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jcpg"] Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.607352 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vkt4s" event={"ID":"ab747cdc-f7d4-4f5a-98de-f82e2ff9fd29","Type":"ContainerStarted","Data":"d8e993f9fb0afa4feaf692ccfc04f66c91d1479c4d3b3e9ae307cae40c1a0775"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.611237 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:44 crc kubenswrapper[4703]: W0309 13:23:44.611825 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0739e37_ae23_4c93_bf96_8c7f6aa72303.slice/crio-de61a0b756d44ebc3403556bc43f05f663afdc1ecf373b05eaca12e803479811 WatchSource:0}: Error finding container de61a0b756d44ebc3403556bc43f05f663afdc1ecf373b05eaca12e803479811: Status 404 returned error can't find the container with id de61a0b756d44ebc3403556bc43f05f663afdc1ecf373b05eaca12e803479811 Mar 09 
13:23:44 crc kubenswrapper[4703]: E0309 13:23:44.611975 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:45.111944387 +0000 UTC m=+221.079360073 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.613889 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhqcw" event={"ID":"e9d54014-8ba5-4a3a-beb0-3b260f0ef64c","Type":"ContainerStarted","Data":"501a12e4816fded63afa9981e236bf35ea26757b67124eac15c6ee92e0d7257b"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.618215 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-tnn9w" event={"ID":"5acad2e3-ce55-43d7-b83f-75adb6b59e71","Type":"ContainerStarted","Data":"cfe2af6d3cc31417e051aa9895791c152bfad68dc67864b00980c54f00dae0bf"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.644447 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 
13:23:44 crc kubenswrapper[4703]: E0309 13:23:44.662040 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:45.162020554 +0000 UTC m=+221.129436280 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.685093 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-jbdq8" podStartSLOduration=168.685078618 podStartE2EDuration="2m48.685078618s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:44.64370636 +0000 UTC m=+220.611122046" watchObservedRunningTime="2026-03-09 13:23:44.685078618 +0000 UTC m=+220.652494304" Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.724180 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-9qw5t" podStartSLOduration=168.724164888 podStartE2EDuration="2m48.724164888s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:44.722519009 +0000 UTC m=+220.689934695" watchObservedRunningTime="2026-03-09 13:23:44.724164888 +0000 UTC m=+220.691580574" Mar 09 13:23:44 crc 
kubenswrapper[4703]: I0309 13:23:44.725386 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-n7xr4" podStartSLOduration=167.725379554 podStartE2EDuration="2m47.725379554s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:44.684838121 +0000 UTC m=+220.652253807" watchObservedRunningTime="2026-03-09 13:23:44.725379554 +0000 UTC m=+220.692795240" Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.725473 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kzdxt" event={"ID":"9b14d2ef-2e68-4bb5-be73-2bbbb837c463","Type":"ContainerStarted","Data":"9260a9023cbcc1b50b9ffbd63ef22e18dce13e4fe4c2589ae0d3479492bb232b"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.733945 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wzpmb"] Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.734111 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" event={"ID":"ed696cd9-4261-4e65-a89d-17f918249fc9","Type":"ContainerStarted","Data":"3e23f1062eaa77e2e6ddfba8c2dcb127e264ad8105e1896739042baccc184dea"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.734946 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.735375 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" event={"ID":"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6","Type":"ContainerStarted","Data":"3419bc78c8fa9e617b4ac25ec1797d82a883c3d8042124aaae36338dd95b7acf"} Mar 09 13:23:44 crc kubenswrapper[4703]: 
I0309 13:23:44.741139 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-vs8rs" event={"ID":"2b4dabbf-243f-4504-abc1-b34b4da6a25c","Type":"ContainerStarted","Data":"b871c8bd5b65f55412c648dcd1544ad22ce2b10f6f8210a11d55d2b71cd8bde2"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.742339 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-vs8rs" Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.754791 4703 patch_prober.go:28] interesting pod/downloads-7954f5f757-vs8rs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.754856 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vs8rs" podUID="2b4dabbf-243f-4504-abc1-b34b4da6a25c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.754991 4703 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-fjq8h container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.755081 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" podUID="ed696cd9-4261-4e65-a89d-17f918249fc9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.759960 4703 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551042-fv4ms"] Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.764671 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:44 crc kubenswrapper[4703]: E0309 13:23:44.764721 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:45.264709162 +0000 UTC m=+221.232124848 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.765624 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.769070 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-88z8c" 
event={"ID":"bc6d0856-76af-4510-a9b8-1cefb8e82e79","Type":"ContainerStarted","Data":"1a114b2ceeabe0c1667cae9431e15d5b740e2bd084e4f5fff3c2f7d1458b3b30"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.769800 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-88z8c" Mar 09 13:23:44 crc kubenswrapper[4703]: E0309 13:23:44.770435 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:45.270424751 +0000 UTC m=+221.237840437 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.770540 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knp8m"] Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.776284 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvmv6"] Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.776630 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-trj95"] Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.783407 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g6rg5"] Mar 09 13:23:44 crc kubenswrapper[4703]: 
I0309 13:23:44.785187 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xzsw" event={"ID":"26e4ae04-1b0c-4cdf-a086-356bab16766e","Type":"ContainerStarted","Data":"15f930d32a29d57a9f030e67e66fdba50418bffa11d81e38d12a48820c700f58"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.785229 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xzsw" event={"ID":"26e4ae04-1b0c-4cdf-a086-356bab16766e","Type":"ContainerStarted","Data":"be339731cb4d3179b9f9996bc131afe0034e3c87bbebc738e148cee7894202ec"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.787584 4703 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-88z8c container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.787638 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-88z8c" podUID="bc6d0856-76af-4510-a9b8-1cefb8e82e79" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.805900 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gg2ch"] Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.829709 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-cmpgs"] Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.849377 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kn4xb" event={"ID":"e9460c02-0c7c-4f14-9950-407330c1f960","Type":"ContainerStarted","Data":"dce6092dfc6f74b19241648dcd88a789c92bc786738bd440456e24220e2419d7"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.858914 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"df31d3ddf26ba1773b7ca1c038e036f3e9693b4950b45259085c20f8fcd68068"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.866502 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:44 crc kubenswrapper[4703]: E0309 13:23:44.868097 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:45.36807212 +0000 UTC m=+221.335487846 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.885770 4703 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.898828 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" event={"ID":"507f368b-bd66-4e0a-8e30-bc8505f3f76f","Type":"ContainerStarted","Data":"cef6906375ffc0e971d292f5bb0f35dad1c5fafaaf281861421dbb516ba8b9c0"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.904546 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ntntn" event={"ID":"80743248-60af-4ab5-990a-be1b945af2ca","Type":"ContainerStarted","Data":"8280143c04bab4006405aa5659dd9005365ffd7c7866c10a9d96f7695b2f7825"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.906613 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wzz7z" event={"ID":"ce9452a4-e880-4740-85e9-37ed967f7c75","Type":"ContainerStarted","Data":"487fa653dbf651f20d3b0400a0cffed33ff1f890b7578d88d56ba5a2b57641bf"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.915018 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7q8lx" podStartSLOduration=167.914999463 podStartE2EDuration="2m47.914999463s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:44.886643521 +0000 UTC m=+220.854059217" watchObservedRunningTime="2026-03-09 13:23:44.914999463 +0000 UTC m=+220.882415149" Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.917067 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6qv87" event={"ID":"7e8298d3-3e49-4df3-9369-2623b11981cf","Type":"ContainerStarted","Data":"107cbf8fc4093d5e509886ea688ba8c749fdc179925bcc6e88a3e70b680a4f5e"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.917101 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6qv87" event={"ID":"7e8298d3-3e49-4df3-9369-2623b11981cf","Type":"ContainerStarted","Data":"f9cdbdd9d2dcc7cd80c1c3306a07a1346f75f52384a453373d30d7be6cf51775"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.923419 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-knk5k" event={"ID":"ae648c70-55be-421c-bcdd-32922dcf947d","Type":"ContainerStarted","Data":"5ec153f2be60353727b1922edb892926e08cee8a4f239caadcfc0d32a05743d3"} Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.932887 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.934506 4703 patch_prober.go:28] interesting pod/console-operator-58897d9998-9qw5t container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.934549 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-9qw5t" 
podUID="f9e0f844-c3b0-4411-ba1a-4c57a5e7796e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.945729 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.962461 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-vs8rs" podStartSLOduration=168.962440561 podStartE2EDuration="2m48.962440561s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:44.954378022 +0000 UTC m=+220.921793738" watchObservedRunningTime="2026-03-09 13:23:44.962440561 +0000 UTC m=+220.929856247" Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.981346 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:44 crc kubenswrapper[4703]: E0309 13:23:44.982561 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:45.482546048 +0000 UTC m=+221.449961734 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:44 crc kubenswrapper[4703]: I0309 13:23:44.994256 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wzz7z" podStartSLOduration=167.994221744 podStartE2EDuration="2m47.994221744s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:44.990206735 +0000 UTC m=+220.957622421" watchObservedRunningTime="2026-03-09 13:23:44.994221744 +0000 UTC m=+220.961637430" Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.051182 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-dhqcw" podStartSLOduration=169.051165074 podStartE2EDuration="2m49.051165074s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.049693391 +0000 UTC m=+221.017109077" watchObservedRunningTime="2026-03-09 13:23:45.051165074 +0000 UTC m=+221.018580760" Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.078062 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4xzsw" podStartSLOduration=169.078045802 podStartE2EDuration="2m49.078045802s" podCreationTimestamp="2026-03-09 13:20:56 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.076674032 +0000 UTC m=+221.044089738" watchObservedRunningTime="2026-03-09 13:23:45.078045802 +0000 UTC m=+221.045461488" Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.089613 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:45 crc kubenswrapper[4703]: E0309 13:23:45.090840 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:45.590821712 +0000 UTC m=+221.558237408 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.128304 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" podStartSLOduration=169.128284854 podStartE2EDuration="2m49.128284854s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.127605373 +0000 UTC m=+221.095021059" watchObservedRunningTime="2026-03-09 13:23:45.128284854 +0000 UTC m=+221.095700540" Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.191352 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:45 crc kubenswrapper[4703]: E0309 13:23:45.192718 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:45.692703556 +0000 UTC m=+221.660119242 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.213337 4703 ???:1] "http: TLS handshake error from 192.168.126.11:38922: no serving certificate available for the kubelet" Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.241204 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vh56s" podStartSLOduration=169.241185365 podStartE2EDuration="2m49.241185365s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.239367021 +0000 UTC m=+221.206782707" watchObservedRunningTime="2026-03-09 13:23:45.241185365 +0000 UTC m=+221.208601051" Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.293666 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:45 crc kubenswrapper[4703]: E0309 13:23:45.294107 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 13:23:45.794087915 +0000 UTC m=+221.761503601 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.301721 4703 ???:1] "http: TLS handshake error from 192.168.126.11:38930: no serving certificate available for the kubelet" Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.381039 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-88z8c" podStartSLOduration=168.381002515 podStartE2EDuration="2m48.381002515s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.294359763 +0000 UTC m=+221.261775449" watchObservedRunningTime="2026-03-09 13:23:45.381002515 +0000 UTC m=+221.348418201" Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.397122 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:45 crc kubenswrapper[4703]: E0309 13:23:45.397494 4703 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:45.897478624 +0000 UTC m=+221.864894310 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.407111 4703 ???:1] "http: TLS handshake error from 192.168.126.11:38946: no serving certificate available for the kubelet" Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.501153 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:45 crc kubenswrapper[4703]: E0309 13:23:45.501344 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:46.001323076 +0000 UTC m=+221.968738762 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.501539 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:45 crc kubenswrapper[4703]: E0309 13:23:45.501923 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:46.001908564 +0000 UTC m=+221.969324250 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.514589 4703 ???:1] "http: TLS handshake error from 192.168.126.11:38948: no serving certificate available for the kubelet" Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.605727 4703 ???:1] "http: TLS handshake error from 192.168.126.11:42574: no serving certificate available for the kubelet" Mar 09 13:23:45 crc kubenswrapper[4703]: E0309 13:23:45.607393 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:46.107368404 +0000 UTC m=+222.074784090 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.608216 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.608396 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:45 crc kubenswrapper[4703]: E0309 13:23:45.608717 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:46.108708174 +0000 UTC m=+222.076123860 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.663142 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-6qv87" podStartSLOduration=168.663120319 podStartE2EDuration="2m48.663120319s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.653796322 +0000 UTC m=+221.621211998" watchObservedRunningTime="2026-03-09 13:23:45.663120319 +0000 UTC m=+221.630536005" Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.709317 4703 ???:1] "http: TLS handshake error from 192.168.126.11:42578: no serving certificate available for the kubelet" Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.710634 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:45 crc kubenswrapper[4703]: E0309 13:23:45.710768 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 13:23:46.210742672 +0000 UTC m=+222.178158348 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.711261 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:45 crc kubenswrapper[4703]: E0309 13:23:45.711551 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:46.211542186 +0000 UTC m=+222.178957872 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.813131 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:45 crc kubenswrapper[4703]: E0309 13:23:45.813741 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:46.313720979 +0000 UTC m=+222.281136665 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.820619 4703 ???:1] "http: TLS handshake error from 192.168.126.11:42582: no serving certificate available for the kubelet" Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.925033 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:45 crc kubenswrapper[4703]: E0309 13:23:45.925625 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:46.42561006 +0000 UTC m=+222.393025746 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.970293 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-knk5k" event={"ID":"ae648c70-55be-421c-bcdd-32922dcf947d","Type":"ContainerStarted","Data":"c6dfd3ef960350e31800ed215224a44c11d0c2589472210f79348005a2d8f5bc"} Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.979424 4703 generic.go:334] "Generic (PLEG): container finished" podID="271141cc-078d-419f-b9c5-d7ef6263442d" containerID="57e9892381532e283d62584b6c1a411c310941aef47f5c9951834593a3953147" exitCode=0 Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.979518 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gnllw" event={"ID":"271141cc-078d-419f-b9c5-d7ef6263442d","Type":"ContainerDied","Data":"57e9892381532e283d62584b6c1a411c310941aef47f5c9951834593a3953147"} Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.991366 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-knk5k" podStartSLOduration=6.991350591 podStartE2EDuration="6.991350591s" podCreationTimestamp="2026-03-09 13:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:45.99096291 +0000 UTC m=+221.958378636" watchObservedRunningTime="2026-03-09 13:23:45.991350591 +0000 UTC m=+221.958766277" Mar 09 13:23:45 
crc kubenswrapper[4703]: I0309 13:23:45.993156 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-88z8c" event={"ID":"bc6d0856-76af-4510-a9b8-1cefb8e82e79","Type":"ContainerStarted","Data":"9ed937e6393d561f93bc092c199e37babb8c373020437020fc1a98868b9e088c"} Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.994426 4703 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-88z8c container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Mar 09 13:23:45 crc kubenswrapper[4703]: I0309 13:23:45.994510 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-88z8c" podUID="bc6d0856-76af-4510-a9b8-1cefb8e82e79" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.017005 4703 generic.go:334] "Generic (PLEG): container finished" podID="c90bc0f7-a19a-46ff-8610-5ee7274faa34" containerID="6666b5de8eb66f4dd5c84e6e3045235719cdeb0f0c53cf3e367be5d21879ffdb" exitCode=0 Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.017107 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" event={"ID":"c90bc0f7-a19a-46ff-8610-5ee7274faa34","Type":"ContainerDied","Data":"6666b5de8eb66f4dd5c84e6e3045235719cdeb0f0c53cf3e367be5d21879ffdb"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.028969 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:46 crc kubenswrapper[4703]: E0309 13:23:46.029297 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:46.529279197 +0000 UTC m=+222.496694883 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.029557 4703 ???:1] "http: TLS handshake error from 192.168.126.11:42592: no serving certificate available for the kubelet" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.039458 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gg2ch" event={"ID":"cd2af265-9ab6-401b-be8f-f8640d043e94","Type":"ContainerStarted","Data":"7b3b20c7b1537b3dcfe54ff9723c3f4bb4ad0f0b0b1329dce03cdabd41665948"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.039510 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gg2ch" event={"ID":"cd2af265-9ab6-401b-be8f-f8640d043e94","Type":"ContainerStarted","Data":"7b943178cd0a5f72b27fdc5fef9d35561c27e6c61ffbef5afd74c3022ec7a21a"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.059334 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-nzj56" 
event={"ID":"7b6068d6-17d2-4802-8778-e1ab076da652","Type":"ContainerStarted","Data":"667885ef45082645982d96b87e5d6ff96f9a0bae91fa1a7f529d55c31f8a6725"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.073383 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj" event={"ID":"3e8bfa07-13f6-43d5-aea2-0ed538eefcc7","Type":"ContainerStarted","Data":"ff6c65005d92c765772ca002fa4d6c81dc34df3cd434c26600b9c21d2ce186d4"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.073435 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj" event={"ID":"3e8bfa07-13f6-43d5-aea2-0ed538eefcc7","Type":"ContainerStarted","Data":"cb3ba1b9bf64b50d820e6755434f91efc058b6ef7b0b603391ae7868865d6f7e"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.074482 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.075831 4703 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-9bpxj container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" start-of-body= Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.075967 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj" podUID="3e8bfa07-13f6-43d5-aea2-0ed538eefcc7" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.094785 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" 
event={"ID":"ed696cd9-4261-4e65-a89d-17f918249fc9","Type":"ContainerStarted","Data":"dbbe1b07469d53314ae644eec26cd29504b9e221cae1fe08bdc1f221bdf23dde"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.095838 4703 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-fjq8h container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.095884 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" podUID="ed696cd9-4261-4e65-a89d-17f918249fc9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.118599 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6fmxt" event={"ID":"a0739e37-ae23-4c93-bf96-8c7f6aa72303","Type":"ContainerStarted","Data":"4aaeae34530b4c353b5636cd6fe3cee8885e35710b75c216c8b592ccd0aec2ab"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.118646 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6fmxt" event={"ID":"a0739e37-ae23-4c93-bf96-8c7f6aa72303","Type":"ContainerStarted","Data":"de61a0b756d44ebc3403556bc43f05f663afdc1ecf373b05eaca12e803479811"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.121036 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdf8" event={"ID":"b216ba2d-b6d1-4363-b54b-57f43f575c33","Type":"ContainerStarted","Data":"5c79d4842f5a49310d374edf1df518bd574e1ff7d0dd1ec7bb748539e6719f85"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 
13:23:46.121064 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdf8" event={"ID":"b216ba2d-b6d1-4363-b54b-57f43f575c33","Type":"ContainerStarted","Data":"0dec61119fa59129f130b22ba0b668b088587d806912c2cc66275d5362bad5f0"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.124102 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fjq8h"] Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.131434 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:46 crc kubenswrapper[4703]: E0309 13:23:46.133993 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:46.633977355 +0000 UTC m=+222.601393131 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.146408 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kzdxt" event={"ID":"9b14d2ef-2e68-4bb5-be73-2bbbb837c463","Type":"ContainerStarted","Data":"b3a2966e90285b0a26a9d1e6e32e797215ce9865607219e5e0d4841c2a0d1b8c"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.148619 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd"] Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.166496 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsdf8" podStartSLOduration=169.16648184 podStartE2EDuration="2m49.16648184s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:46.165560282 +0000 UTC m=+222.132975968" watchObservedRunningTime="2026-03-09 13:23:46.16648184 +0000 UTC m=+222.133897526" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.170929 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jcpg" event={"ID":"a62dccf6-39a0-499e-a596-d6e80d3c0326","Type":"ContainerStarted","Data":"5777687f9d0789602c7c17f12ca3d6f1882b58895da9da38b2ce552df3cc2b30"} Mar 09 13:23:46 crc 
kubenswrapper[4703]: I0309 13:23:46.170974 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jcpg" event={"ID":"a62dccf6-39a0-499e-a596-d6e80d3c0326","Type":"ContainerStarted","Data":"f879a579b6b202a28cc1536b84520da6ed7fcc90533d112c4a16cf445c162885"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.187456 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vkt4s" event={"ID":"ab747cdc-f7d4-4f5a-98de-f82e2ff9fd29","Type":"ContainerStarted","Data":"33cdf3364be3a0e3d25f76a75f9e31fcc27029ebee87e0cf73c8539197aa1889"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.217582 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cmpgs" event={"ID":"80dcf615-4769-42bf-922e-f3ad191d8a10","Type":"ContainerStarted","Data":"b696ae6ea21b84408238b2180f341790bcdacc5c8ca28a21a2c8516ce496c3f5"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.217634 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cmpgs" event={"ID":"80dcf615-4769-42bf-922e-f3ad191d8a10","Type":"ContainerStarted","Data":"bdb8741aa84b68c7ef0268a79756078e657f44acc6e34f294cf32ef8aaecb700"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.217652 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cmpgs" event={"ID":"80dcf615-4769-42bf-922e-f3ad191d8a10","Type":"ContainerStarted","Data":"28ce4832a91ae1a0e83e8a80b45b7677755b57d452e41153b843b6259fdcaf76"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.221203 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551042-fv4ms" 
event={"ID":"64728a68-4675-4652-a800-7f055197862b","Type":"ContainerStarted","Data":"a95a5b95b41a1da0f96330b97a3b100cd810121f09a78b3d2a63aa8174ca1c16"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.226477 4703 generic.go:334] "Generic (PLEG): container finished" podID="507f368b-bd66-4e0a-8e30-bc8505f3f76f" containerID="3dcb6c4995fc194d05a83857ca6bc9cb38aa7b8fe91aa5742aaa19d8d679458c" exitCode=0 Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.226553 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" event={"ID":"507f368b-bd66-4e0a-8e30-bc8505f3f76f","Type":"ContainerDied","Data":"3dcb6c4995fc194d05a83857ca6bc9cb38aa7b8fe91aa5742aaa19d8d679458c"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.232738 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.233048 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wzz7z" event={"ID":"ce9452a4-e880-4740-85e9-37ed967f7c75","Type":"ContainerStarted","Data":"d49599aa32dfb836bdb4697d0a7fcda58a878bd9545135e15d1f7487c9313e32"} Mar 09 13:23:46 crc kubenswrapper[4703]: E0309 13:23:46.234969 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:46.734951142 +0000 UTC m=+222.702366828 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.236951 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-4mjnk" event={"ID":"330bced9-c478-4eb6-8b4d-10d69e5a6965","Type":"ContainerStarted","Data":"c02a66815067f308ad8dd7aeb9a18255fe4ba2d03ae1b2ad0e708fc7d267f6b6"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.250274 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj" podStartSLOduration=169.250253776 podStartE2EDuration="2m49.250253776s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:46.208558229 +0000 UTC m=+222.175973915" watchObservedRunningTime="2026-03-09 13:23:46.250253776 +0000 UTC m=+222.217669462" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.289355 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g6rg5" event={"ID":"60dddbd2-8b96-4ee8-8be3-d745ce95d197","Type":"ContainerStarted","Data":"eef258628735b17bbc5c92514b4065096cd055142040b4cc6fbfdc59165ab95c"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.289401 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g6rg5" event={"ID":"60dddbd2-8b96-4ee8-8be3-d745ce95d197","Type":"ContainerStarted","Data":"0ffa309f377e599a4c3c1a2545e9c64efe571698eb9d0982fef941866d71e97e"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.299178 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-nzj56" podStartSLOduration=169.299159468 podStartE2EDuration="2m49.299159468s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:46.266423496 +0000 UTC m=+222.233839182" watchObservedRunningTime="2026-03-09 13:23:46.299159468 +0000 UTC m=+222.266575144" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.301539 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6fmxt" podStartSLOduration=169.301528098 podStartE2EDuration="2m49.301528098s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:46.300534449 +0000 UTC m=+222.267950145" watchObservedRunningTime="2026-03-09 13:23:46.301528098 +0000 UTC m=+222.268943784" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.312349 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5clrh" event={"ID":"f9fc9149-f3ff-430c-9ea7-b6050cc19222","Type":"ContainerStarted","Data":"d11a6139980a0dbcf7755525326ca2e9ee2382ee58abda15cedf630d5e87a0b2"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.312395 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5clrh" 
event={"ID":"f9fc9149-f3ff-430c-9ea7-b6050cc19222","Type":"ContainerStarted","Data":"b6fc0d94763577b8497ee074a4d462b348dec0f1d53548132ad9f2f4c059313d"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.335044 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:46 crc kubenswrapper[4703]: E0309 13:23:46.338046 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:46.838028702 +0000 UTC m=+222.805444428 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.357706 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvmv6" event={"ID":"506de425-cf01-4f0a-b1d2-6e987aaf1580","Type":"ContainerStarted","Data":"7fff80b9a3dfbab48a0e4f6abf0f6b003d0574f86d3c763d0614c596b3ba2acd"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.357764 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvmv6" 
event={"ID":"506de425-cf01-4f0a-b1d2-6e987aaf1580","Type":"ContainerStarted","Data":"80abf4380dc9ff0f89c6ce0dacc5b37762ae5cb2ceccb67dbca7174c0f4d5194"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.358789 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvmv6" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.368375 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" event={"ID":"47af2971-9c78-4d96-a31f-8b77ea0adca2","Type":"ContainerStarted","Data":"a2b79b08c5993f1fa4d2576a57241202d122089aaed8847862fa8260cb159d13"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.368427 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" event={"ID":"47af2971-9c78-4d96-a31f-8b77ea0adca2","Type":"ContainerStarted","Data":"2192b87eda623302c4115b7d9c8957ddd613a611eeb4a06899874132bb6c2b49"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.369501 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.370956 4703 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dvmv6 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.380662 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvmv6" podUID="506de425-cf01-4f0a-b1d2-6e987aaf1580" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Mar 09 13:23:46 crc 
kubenswrapper[4703]: I0309 13:23:46.372316 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knp8m" event={"ID":"d8e22655-eb49-4031-926f-866b29597424","Type":"ContainerStarted","Data":"6c9721dae03dec0a816e95dc33e8fbfe15b093d82d169265e8e9f05bbfae44ee"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.381536 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knp8m" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.381572 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knp8m" event={"ID":"d8e22655-eb49-4031-926f-866b29597424","Type":"ContainerStarted","Data":"978334708dbc1f277b4ef0bc091cabbb28968d393d118a1bfb071ac9ac63c58d"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.381591 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knp8m" event={"ID":"d8e22655-eb49-4031-926f-866b29597424","Type":"ContainerStarted","Data":"41da8d989e8e56ee364fa0cae941bc5f1eae04c19853ce59080b486090acd796"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.380408 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vkt4s" podStartSLOduration=7.380380609 podStartE2EDuration="7.380380609s" podCreationTimestamp="2026-03-09 13:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:46.355229382 +0000 UTC m=+222.322645068" watchObservedRunningTime="2026-03-09 13:23:46.380380609 +0000 UTC m=+222.347796305" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.387161 4703 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ptzrr 
container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.387222 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" podUID="47af2971-9c78-4d96-a31f-8b77ea0adca2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.390688 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-wzpmb" event={"ID":"7fd7c6c0-8841-4e1a-bc44-3d029844e793","Type":"ContainerStarted","Data":"74c21e8be7285b706eb71d7138ce022ae093eaa2023b36a929daba38edffb357"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.390734 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-wzpmb" event={"ID":"7fd7c6c0-8841-4e1a-bc44-3d029844e793","Type":"ContainerStarted","Data":"4136ffb4d1d967d9e9327707a4dc24349ce3a0ee77abac6c85e3d1a61380646b"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.405167 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-g6rg5" podStartSLOduration=169.405144604 podStartE2EDuration="2m49.405144604s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:46.404936948 +0000 UTC m=+222.372352634" watchObservedRunningTime="2026-03-09 13:23:46.405144604 +0000 UTC m=+222.372560290" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.423647 4703 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hzcrb" event={"ID":"866e32df-fa52-499f-9a9f-dfb02663f3b5","Type":"ContainerStarted","Data":"1f0f00e73731d6f3411816cdd610e4aea46f5115b3b7ec501a76f1cfd6bd137b"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.428793 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kzdxt" podStartSLOduration=169.428781194 podStartE2EDuration="2m49.428781194s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:46.428529177 +0000 UTC m=+222.395944863" watchObservedRunningTime="2026-03-09 13:23:46.428781194 +0000 UTC m=+222.396196880" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.438575 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:46 crc kubenswrapper[4703]: E0309 13:23:46.439697 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:46.939680428 +0000 UTC m=+222.907096114 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.476417 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ntntn" event={"ID":"80743248-60af-4ab5-990a-be1b945af2ca","Type":"ContainerStarted","Data":"58881b04d2781651dab47de41d434c5e8b36180034128ff088f407e76933782a"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.504234 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cmpgs" podStartSLOduration=169.504214813 podStartE2EDuration="2m49.504214813s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:46.474639735 +0000 UTC m=+222.442055421" watchObservedRunningTime="2026-03-09 13:23:46.504214813 +0000 UTC m=+222.471630499" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.541558 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.542570 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kn4xb" event={"ID":"e9460c02-0c7c-4f14-9950-407330c1f960","Type":"ContainerStarted","Data":"ca36aa5de73792cc3468f2079c647ce80058e410932c950be31b6ab8c03548ab"} Mar 09 13:23:46 crc kubenswrapper[4703]: E0309 13:23:46.544252 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:47.044238251 +0000 UTC m=+223.011653937 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.556422 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-4mjnk" podStartSLOduration=170.556378572 podStartE2EDuration="2m50.556378572s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:46.512752217 +0000 UTC m=+222.480167903" watchObservedRunningTime="2026-03-09 13:23:46.556378572 +0000 UTC m=+222.523794268" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.585219 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jcpg" podStartSLOduration=169.585202057 podStartE2EDuration="2m49.585202057s" podCreationTimestamp="2026-03-09 13:20:57 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:46.55563933 +0000 UTC m=+222.523055016" watchObservedRunningTime="2026-03-09 13:23:46.585202057 +0000 UTC m=+222.552617743" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.610170 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1b870a9865cc45029c69042bc50a46dabe83511b009e853defae8da5fdf5515e"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.613251 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-nzj56" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.620491 4703 patch_prober.go:28] interesting pod/router-default-5444994796-nzj56 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:23:46 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Mar 09 13:23:46 crc kubenswrapper[4703]: [+]process-running ok Mar 09 13:23:46 crc kubenswrapper[4703]: healthz check failed Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.620541 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.620522 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-ntntn" podStartSLOduration=169.620505005 podStartE2EDuration="2m49.620505005s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:46.618702472 +0000 UTC m=+222.586118158" watchObservedRunningTime="2026-03-09 13:23:46.620505005 +0000 UTC m=+222.587920691" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.630190 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-trj95" event={"ID":"737f3034-5b9b-49b1-9be8-4a74363050c7","Type":"ContainerStarted","Data":"ff7563fb47d2d909835f33509e949bc32040d24c2c4c1b927ede520d6c5f2e43"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.630242 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-trj95" event={"ID":"737f3034-5b9b-49b1-9be8-4a74363050c7","Type":"ContainerStarted","Data":"49373dba9cfa0b24fe7035f5b71734d80669e7fa84c5104a397fd55578214bbe"} Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.636142 4703 patch_prober.go:28] interesting pod/downloads-7954f5f757-vs8rs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.636206 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vs8rs" podUID="2b4dabbf-243f-4504-abc1-b34b4da6a25c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.644295 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:46 crc kubenswrapper[4703]: E0309 13:23:46.645400 
4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:47.145382004 +0000 UTC m=+223.112797690 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.646159 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.650144 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-wzpmb" podStartSLOduration=169.650123074 podStartE2EDuration="2m49.650123074s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:46.646453095 +0000 UTC m=+222.613868781" watchObservedRunningTime="2026-03-09 13:23:46.650123074 +0000 UTC m=+222.617538760" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.665758 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-9qw5t" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.718736 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" 
podStartSLOduration=169.71871733 podStartE2EDuration="2m49.71871733s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:46.676021263 +0000 UTC m=+222.643436949" watchObservedRunningTime="2026-03-09 13:23:46.71871733 +0000 UTC m=+222.686133016" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.720703 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kn4xb" podStartSLOduration=170.720693949 podStartE2EDuration="2m50.720693949s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:46.718324399 +0000 UTC m=+222.685740085" watchObservedRunningTime="2026-03-09 13:23:46.720693949 +0000 UTC m=+222.688109635" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.723873 4703 ???:1] "http: TLS handshake error from 192.168.126.11:42598: no serving certificate available for the kubelet" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.746402 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:46 crc kubenswrapper[4703]: E0309 13:23:46.754480 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:47.254463101 +0000 UTC m=+223.221878877 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.808736 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hzcrb" podStartSLOduration=169.808716442 podStartE2EDuration="2m49.808716442s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:46.77092261 +0000 UTC m=+222.738338296" watchObservedRunningTime="2026-03-09 13:23:46.808716442 +0000 UTC m=+222.776132138" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.839546 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvmv6" podStartSLOduration=169.839531806 podStartE2EDuration="2m49.839531806s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:46.837231058 +0000 UTC m=+222.804646744" watchObservedRunningTime="2026-03-09 13:23:46.839531806 +0000 UTC m=+222.806947492" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.839977 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knp8m" podStartSLOduration=169.839970839 podStartE2EDuration="2m49.839970839s" 
podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:46.811811164 +0000 UTC m=+222.779226850" watchObservedRunningTime="2026-03-09 13:23:46.839970839 +0000 UTC m=+222.807386525" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.847989 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:46 crc kubenswrapper[4703]: E0309 13:23:46.848264 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:47.348246215 +0000 UTC m=+223.315661901 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.882971 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-5clrh" podStartSLOduration=170.882952725 podStartE2EDuration="2m50.882952725s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:46.882140101 +0000 UTC m=+222.849555787" watchObservedRunningTime="2026-03-09 13:23:46.882952725 +0000 UTC m=+222.850368411" Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.949766 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:46 crc kubenswrapper[4703]: E0309 13:23:46.950194 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:47.450179581 +0000 UTC m=+223.417595267 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:46 crc kubenswrapper[4703]: I0309 13:23:46.958677 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-trj95" podStartSLOduration=7.958659952 podStartE2EDuration="7.958659952s" podCreationTimestamp="2026-03-09 13:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:46.955397556 +0000 UTC m=+222.922813242" watchObservedRunningTime="2026-03-09 13:23:46.958659952 +0000 UTC m=+222.926075648" Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.063928 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:47 crc kubenswrapper[4703]: E0309 13:23:47.064404 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:47.564384951 +0000 UTC m=+223.531800637 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.166087 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:47 crc kubenswrapper[4703]: E0309 13:23:47.166399 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:47.666384848 +0000 UTC m=+223.633800534 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.266708 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:47 crc kubenswrapper[4703]: E0309 13:23:47.266877 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:47.76685653 +0000 UTC m=+223.734272216 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.267276 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:47 crc kubenswrapper[4703]: E0309 13:23:47.267556 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:47.767544581 +0000 UTC m=+223.734960267 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.367892 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:47 crc kubenswrapper[4703]: E0309 13:23:47.368295 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:47.86826537 +0000 UTC m=+223.835681056 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.470046 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:47 crc kubenswrapper[4703]: E0309 13:23:47.470530 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:47.970499135 +0000 UTC m=+223.937914821 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.571173 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:47 crc kubenswrapper[4703]: E0309 13:23:47.571370 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:48.071348228 +0000 UTC m=+224.038763934 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.571469 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:47 crc kubenswrapper[4703]: E0309 13:23:47.571723 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:48.071715209 +0000 UTC m=+224.039130895 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.619264 4703 patch_prober.go:28] interesting pod/router-default-5444994796-nzj56 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:23:47 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Mar 09 13:23:47 crc kubenswrapper[4703]: [+]process-running ok Mar 09 13:23:47 crc kubenswrapper[4703]: healthz check failed Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.619336 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.641906 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ntntn" event={"ID":"80743248-60af-4ab5-990a-be1b945af2ca","Type":"ContainerStarted","Data":"af4542ad3ae62a72cc03047833f24e2049b7625edc2b1a89cc4cf89e093bf6f2"} Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.650338 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-trj95" event={"ID":"737f3034-5b9b-49b1-9be8-4a74363050c7","Type":"ContainerStarted","Data":"f0a879502eae34df6743cb05be79968369c673c76a08d101ea0a657edfaaae2d"} Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.651112 4703 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-trj95" Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.669554 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" event={"ID":"c90bc0f7-a19a-46ff-8610-5ee7274faa34","Type":"ContainerStarted","Data":"63c22b9f8d865a870e32e937e2ce6bd1df479a7bf741e581acb8f5cd82e8c5c5"} Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.675675 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:47 crc kubenswrapper[4703]: E0309 13:23:47.676087 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:48.176068357 +0000 UTC m=+224.143484043 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.683025 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" event={"ID":"507f368b-bd66-4e0a-8e30-bc8505f3f76f","Type":"ContainerStarted","Data":"cb95c2980b1016597cc12b36fada92bd7c3c789adb5ea0027f9d5cca1704d621"} Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.683083 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" event={"ID":"507f368b-bd66-4e0a-8e30-bc8505f3f76f","Type":"ContainerStarted","Data":"d4d1fbb91eb43afd99478d87df572d42aea225ac9aeedf0daadda15a3f02f3af"} Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.687193 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" event={"ID":"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6","Type":"ContainerStarted","Data":"24b03360eb4ace51d6b7ed7b3a64d180ebeb401033cccf5aa36fd3792baeac4e"} Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.692962 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49" podStartSLOduration=170.692946768 podStartE2EDuration="2m50.692946768s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:47.689012481 +0000 UTC m=+223.656428167" watchObservedRunningTime="2026-03-09 13:23:47.692946768 +0000 UTC 
m=+223.660362444" Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.703494 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gg2ch" event={"ID":"cd2af265-9ab6-401b-be8f-f8640d043e94","Type":"ContainerStarted","Data":"3db9eccc323e51df9b272acafcb13714debe1128b66d549a4bb580d329ef71f1"} Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.712561 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.712600 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.713281 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gnllw" event={"ID":"271141cc-078d-419f-b9c5-d7ef6263442d","Type":"ContainerStarted","Data":"a9d957cdbcfec8a41c61c87cd00dcc490942943790a5d0b0a148516f6d6f0ef2"} Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.713664 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" podUID="04b27022-1294-45a5-90c4-17d007e9b468" containerName="route-controller-manager" containerID="cri-o://8e14d39b7d6d857cf5f7aec953b73acebc6a9da60c39b571f4170a2f8709fe48" gracePeriod=30 Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.716128 4703 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ptzrr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.716163 4703 patch_prober.go:28] interesting pod/downloads-7954f5f757-vs8rs 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.716174 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" podUID="47af2971-9c78-4d96-a31f-8b77ea0adca2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.716189 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vs8rs" podUID="2b4dabbf-243f-4504-abc1-b34b4da6a25c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.716183 4703 patch_prober.go:28] interesting pod/apiserver-76f77b778f-k8tgg container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.716219 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" podUID="507f368b-bd66-4e0a-8e30-bc8505f3f76f" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.716320 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gnllw" Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.722154 4703 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" podStartSLOduration=171.722137564 podStartE2EDuration="2m51.722137564s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:47.721253558 +0000 UTC m=+223.688669264" watchObservedRunningTime="2026-03-09 13:23:47.722137564 +0000 UTC m=+223.689553250" Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.729765 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-88z8c" Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.739811 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.758487 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gnllw" podStartSLOduration=171.758465443 podStartE2EDuration="2m51.758465443s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:47.755428702 +0000 UTC m=+223.722844388" watchObservedRunningTime="2026-03-09 13:23:47.758465443 +0000 UTC m=+223.725881129" Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.768174 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dvmv6" Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.779692 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:47 crc kubenswrapper[4703]: E0309 13:23:47.797695 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:48.297674806 +0000 UTC m=+224.265090562 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.823329 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gg2ch" podStartSLOduration=170.823298527 podStartE2EDuration="2m50.823298527s" podCreationTimestamp="2026-03-09 13:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:47.783109574 +0000 UTC m=+223.750525260" watchObservedRunningTime="2026-03-09 13:23:47.823298527 +0000 UTC m=+223.790714213" Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.884336 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:47 crc 
kubenswrapper[4703]: E0309 13:23:47.885066 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:48.38504968 +0000 UTC m=+224.352465356 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:47 crc kubenswrapper[4703]: I0309 13:23:47.987404 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:47 crc kubenswrapper[4703]: E0309 13:23:47.987708 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:48.487697257 +0000 UTC m=+224.455112943 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.056186 4703 ???:1] "http: TLS handshake error from 192.168.126.11:42604: no serving certificate available for the kubelet" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.088887 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:48 crc kubenswrapper[4703]: E0309 13:23:48.089096 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:48.589071986 +0000 UTC m=+224.556487672 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.089195 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:48 crc kubenswrapper[4703]: E0309 13:23:48.089524 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:48.589516859 +0000 UTC m=+224.556932545 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.190277 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:48 crc kubenswrapper[4703]: E0309 13:23:48.190424 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:48.690397833 +0000 UTC m=+224.657813519 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.190620 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:48 crc kubenswrapper[4703]: E0309 13:23:48.190965 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:48.69095321 +0000 UTC m=+224.658368896 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.291615 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:48 crc kubenswrapper[4703]: E0309 13:23:48.291725 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:48.79170931 +0000 UTC m=+224.759124996 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.292035 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:48 crc kubenswrapper[4703]: E0309 13:23:48.292423 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:48.792415621 +0000 UTC m=+224.759831307 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.394998 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:48 crc kubenswrapper[4703]: E0309 13:23:48.395394 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:48.895380128 +0000 UTC m=+224.862795814 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.413046 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9bpxj" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.496698 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.498558 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:48 crc kubenswrapper[4703]: E0309 13:23:48.499010 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:48.998997123 +0000 UTC m=+224.966412809 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.599354 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.599409 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04b27022-1294-45a5-90c4-17d007e9b468-config\") pod \"04b27022-1294-45a5-90c4-17d007e9b468\" (UID: \"04b27022-1294-45a5-90c4-17d007e9b468\") " Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.599483 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wft66\" (UniqueName: \"kubernetes.io/projected/04b27022-1294-45a5-90c4-17d007e9b468-kube-api-access-wft66\") pod \"04b27022-1294-45a5-90c4-17d007e9b468\" (UID: \"04b27022-1294-45a5-90c4-17d007e9b468\") " Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.599515 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04b27022-1294-45a5-90c4-17d007e9b468-serving-cert\") pod \"04b27022-1294-45a5-90c4-17d007e9b468\" (UID: \"04b27022-1294-45a5-90c4-17d007e9b468\") " Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.599580 4703 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04b27022-1294-45a5-90c4-17d007e9b468-client-ca\") pod \"04b27022-1294-45a5-90c4-17d007e9b468\" (UID: \"04b27022-1294-45a5-90c4-17d007e9b468\") " Mar 09 13:23:48 crc kubenswrapper[4703]: E0309 13:23:48.600442 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:49.100417474 +0000 UTC m=+225.067833160 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.600689 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04b27022-1294-45a5-90c4-17d007e9b468-config" (OuterVolumeSpecName: "config") pod "04b27022-1294-45a5-90c4-17d007e9b468" (UID: "04b27022-1294-45a5-90c4-17d007e9b468"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.601022 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04b27022-1294-45a5-90c4-17d007e9b468-client-ca" (OuterVolumeSpecName: "client-ca") pod "04b27022-1294-45a5-90c4-17d007e9b468" (UID: "04b27022-1294-45a5-90c4-17d007e9b468"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.618017 4703 patch_prober.go:28] interesting pod/router-default-5444994796-nzj56 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:23:48 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Mar 09 13:23:48 crc kubenswrapper[4703]: [+]process-running ok Mar 09 13:23:48 crc kubenswrapper[4703]: healthz check failed Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.618075 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.627019 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b27022-1294-45a5-90c4-17d007e9b468-kube-api-access-wft66" (OuterVolumeSpecName: "kube-api-access-wft66") pod "04b27022-1294-45a5-90c4-17d007e9b468" (UID: "04b27022-1294-45a5-90c4-17d007e9b468"). InnerVolumeSpecName "kube-api-access-wft66". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.627934 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b27022-1294-45a5-90c4-17d007e9b468-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "04b27022-1294-45a5-90c4-17d007e9b468" (UID: "04b27022-1294-45a5-90c4-17d007e9b468"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.701239 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kvftn"] Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.701351 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.701480 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04b27022-1294-45a5-90c4-17d007e9b468-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.701493 4703 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04b27022-1294-45a5-90c4-17d007e9b468-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.701502 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04b27022-1294-45a5-90c4-17d007e9b468-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.701510 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wft66\" (UniqueName: \"kubernetes.io/projected/04b27022-1294-45a5-90c4-17d007e9b468-kube-api-access-wft66\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:48 crc kubenswrapper[4703]: E0309 13:23:48.701482 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b27022-1294-45a5-90c4-17d007e9b468" containerName="route-controller-manager" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 
13:23:48.701530 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b27022-1294-45a5-90c4-17d007e9b468" containerName="route-controller-manager" Mar 09 13:23:48 crc kubenswrapper[4703]: E0309 13:23:48.701820 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:49.201805363 +0000 UTC m=+225.169221049 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.702249 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="04b27022-1294-45a5-90c4-17d007e9b468" containerName="route-controller-manager" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.703224 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kvftn" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.732308 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.749058 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kvftn"] Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.783937 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zc87r"] Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.784821 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zc87r" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.788235 4703 generic.go:334] "Generic (PLEG): container finished" podID="04b27022-1294-45a5-90c4-17d007e9b468" containerID="8e14d39b7d6d857cf5f7aec953b73acebc6a9da60c39b571f4170a2f8709fe48" exitCode=0 Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.788579 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" podUID="ed696cd9-4261-4e65-a89d-17f918249fc9" containerName="controller-manager" containerID="cri-o://dbbe1b07469d53314ae644eec26cd29504b9e221cae1fe08bdc1f221bdf23dde" gracePeriod=30 Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.788290 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.788306 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" event={"ID":"04b27022-1294-45a5-90c4-17d007e9b468","Type":"ContainerDied","Data":"8e14d39b7d6d857cf5f7aec953b73acebc6a9da60c39b571f4170a2f8709fe48"} Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.789926 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd" event={"ID":"04b27022-1294-45a5-90c4-17d007e9b468","Type":"ContainerDied","Data":"f7c0f2396263320a4d712797f891da07e944ef355ab8225f6bc56cb696d7eaf5"} Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.789957 4703 scope.go:117] "RemoveContainer" containerID="8e14d39b7d6d857cf5f7aec953b73acebc6a9da60c39b571f4170a2f8709fe48" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.794061 4703 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ptzrr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.794126 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" podUID="47af2971-9c78-4d96-a31f-8b77ea0adca2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.800920 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.803469 4703 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.803771 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf8nd\" (UniqueName: \"kubernetes.io/projected/174b5189-1afe-40a3-813b-052dd29ad296-kube-api-access-bf8nd\") pod \"community-operators-kvftn\" (UID: \"174b5189-1afe-40a3-813b-052dd29ad296\") " pod="openshift-marketplace/community-operators-kvftn" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.803811 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174b5189-1afe-40a3-813b-052dd29ad296-catalog-content\") pod \"community-operators-kvftn\" (UID: \"174b5189-1afe-40a3-813b-052dd29ad296\") " pod="openshift-marketplace/community-operators-kvftn" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.804081 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174b5189-1afe-40a3-813b-052dd29ad296-utilities\") pod \"community-operators-kvftn\" (UID: \"174b5189-1afe-40a3-813b-052dd29ad296\") " pod="openshift-marketplace/community-operators-kvftn" Mar 09 13:23:48 crc kubenswrapper[4703]: E0309 13:23:48.804335 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:49.304294415 +0000 UTC m=+225.271710101 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.846101 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zc87r"] Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.875160 4703 scope.go:117] "RemoveContainer" containerID="8e14d39b7d6d857cf5f7aec953b73acebc6a9da60c39b571f4170a2f8709fe48" Mar 09 13:23:48 crc kubenswrapper[4703]: E0309 13:23:48.883924 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e14d39b7d6d857cf5f7aec953b73acebc6a9da60c39b571f4170a2f8709fe48\": container with ID starting with 8e14d39b7d6d857cf5f7aec953b73acebc6a9da60c39b571f4170a2f8709fe48 not found: ID does not exist" containerID="8e14d39b7d6d857cf5f7aec953b73acebc6a9da60c39b571f4170a2f8709fe48" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.883977 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e14d39b7d6d857cf5f7aec953b73acebc6a9da60c39b571f4170a2f8709fe48"} err="failed to get container status \"8e14d39b7d6d857cf5f7aec953b73acebc6a9da60c39b571f4170a2f8709fe48\": rpc error: code = NotFound desc = could not find container \"8e14d39b7d6d857cf5f7aec953b73acebc6a9da60c39b571f4170a2f8709fe48\": container with ID starting with 8e14d39b7d6d857cf5f7aec953b73acebc6a9da60c39b571f4170a2f8709fe48 not found: ID does not exist" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.908562 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.908609 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ab69d10-042e-422a-830e-65d3d1132197-catalog-content\") pod \"certified-operators-zc87r\" (UID: \"0ab69d10-042e-422a-830e-65d3d1132197\") " pod="openshift-marketplace/certified-operators-zc87r" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.908705 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fskpx\" (UniqueName: \"kubernetes.io/projected/0ab69d10-042e-422a-830e-65d3d1132197-kube-api-access-fskpx\") pod \"certified-operators-zc87r\" (UID: \"0ab69d10-042e-422a-830e-65d3d1132197\") " pod="openshift-marketplace/certified-operators-zc87r" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.908775 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf8nd\" (UniqueName: \"kubernetes.io/projected/174b5189-1afe-40a3-813b-052dd29ad296-kube-api-access-bf8nd\") pod \"community-operators-kvftn\" (UID: \"174b5189-1afe-40a3-813b-052dd29ad296\") " pod="openshift-marketplace/community-operators-kvftn" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.908821 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174b5189-1afe-40a3-813b-052dd29ad296-catalog-content\") pod \"community-operators-kvftn\" (UID: \"174b5189-1afe-40a3-813b-052dd29ad296\") " pod="openshift-marketplace/community-operators-kvftn" Mar 09 13:23:48 crc 
kubenswrapper[4703]: I0309 13:23:48.908890 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ab69d10-042e-422a-830e-65d3d1132197-utilities\") pod \"certified-operators-zc87r\" (UID: \"0ab69d10-042e-422a-830e-65d3d1132197\") " pod="openshift-marketplace/certified-operators-zc87r" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.908974 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174b5189-1afe-40a3-813b-052dd29ad296-utilities\") pod \"community-operators-kvftn\" (UID: \"174b5189-1afe-40a3-813b-052dd29ad296\") " pod="openshift-marketplace/community-operators-kvftn" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.910326 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174b5189-1afe-40a3-813b-052dd29ad296-catalog-content\") pod \"community-operators-kvftn\" (UID: \"174b5189-1afe-40a3-813b-052dd29ad296\") " pod="openshift-marketplace/community-operators-kvftn" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.919030 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd"] Mar 09 13:23:48 crc kubenswrapper[4703]: E0309 13:23:48.920754 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:49.420735611 +0000 UTC m=+225.388151367 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.921451 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174b5189-1afe-40a3-813b-052dd29ad296-utilities\") pod \"community-operators-kvftn\" (UID: \"174b5189-1afe-40a3-813b-052dd29ad296\") " pod="openshift-marketplace/community-operators-kvftn" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.934493 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvbdd"] Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.945865 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9gzfj"] Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.950674 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9gzfj" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.952299 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf8nd\" (UniqueName: \"kubernetes.io/projected/174b5189-1afe-40a3-813b-052dd29ad296-kube-api-access-bf8nd\") pod \"community-operators-kvftn\" (UID: \"174b5189-1afe-40a3-813b-052dd29ad296\") " pod="openshift-marketplace/community-operators-kvftn" Mar 09 13:23:48 crc kubenswrapper[4703]: I0309 13:23:48.976054 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9gzfj"] Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.011400 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.011665 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ab69d10-042e-422a-830e-65d3d1132197-catalog-content\") pod \"certified-operators-zc87r\" (UID: \"0ab69d10-042e-422a-830e-65d3d1132197\") " pod="openshift-marketplace/certified-operators-zc87r" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.011701 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fskpx\" (UniqueName: \"kubernetes.io/projected/0ab69d10-042e-422a-830e-65d3d1132197-kube-api-access-fskpx\") pod \"certified-operators-zc87r\" (UID: \"0ab69d10-042e-422a-830e-65d3d1132197\") " pod="openshift-marketplace/certified-operators-zc87r" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.011734 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ab69d10-042e-422a-830e-65d3d1132197-utilities\") pod \"certified-operators-zc87r\" (UID: \"0ab69d10-042e-422a-830e-65d3d1132197\") " pod="openshift-marketplace/certified-operators-zc87r" Mar 09 13:23:49 crc kubenswrapper[4703]: E0309 13:23:49.011804 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:49.511786454 +0000 UTC m=+225.479202140 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.012154 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ab69d10-042e-422a-830e-65d3d1132197-catalog-content\") pod \"certified-operators-zc87r\" (UID: \"0ab69d10-042e-422a-830e-65d3d1132197\") " pod="openshift-marketplace/certified-operators-zc87r" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.012214 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ab69d10-042e-422a-830e-65d3d1132197-utilities\") pod \"certified-operators-zc87r\" (UID: \"0ab69d10-042e-422a-830e-65d3d1132197\") " pod="openshift-marketplace/certified-operators-zc87r" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.031430 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fskpx\" (UniqueName: \"kubernetes.io/projected/0ab69d10-042e-422a-830e-65d3d1132197-kube-api-access-fskpx\") pod \"certified-operators-zc87r\" (UID: \"0ab69d10-042e-422a-830e-65d3d1132197\") " pod="openshift-marketplace/certified-operators-zc87r" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.071586 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kvftn" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.113487 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c0fe86d-3006-44b3-811a-f28758ef07fd-catalog-content\") pod \"community-operators-9gzfj\" (UID: \"7c0fe86d-3006-44b3-811a-f28758ef07fd\") " pod="openshift-marketplace/community-operators-9gzfj" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.113531 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c0fe86d-3006-44b3-811a-f28758ef07fd-utilities\") pod \"community-operators-9gzfj\" (UID: \"7c0fe86d-3006-44b3-811a-f28758ef07fd\") " pod="openshift-marketplace/community-operators-9gzfj" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.113571 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.113598 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwqvr\" (UniqueName: 
\"kubernetes.io/projected/7c0fe86d-3006-44b3-811a-f28758ef07fd-kube-api-access-mwqvr\") pod \"community-operators-9gzfj\" (UID: \"7c0fe86d-3006-44b3-811a-f28758ef07fd\") " pod="openshift-marketplace/community-operators-9gzfj" Mar 09 13:23:49 crc kubenswrapper[4703]: E0309 13:23:49.113977 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:49.613944876 +0000 UTC m=+225.581360562 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.145483 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9tq2q"] Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.146481 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9tq2q" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.156132 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zc87r" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.165798 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9tq2q"] Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.215117 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.215505 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adee29a3-c626-4dc3-8323-ce1e852faea3-catalog-content\") pod \"certified-operators-9tq2q\" (UID: \"adee29a3-c626-4dc3-8323-ce1e852faea3\") " pod="openshift-marketplace/certified-operators-9tq2q" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.215589 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gttc\" (UniqueName: \"kubernetes.io/projected/adee29a3-c626-4dc3-8323-ce1e852faea3-kube-api-access-6gttc\") pod \"certified-operators-9tq2q\" (UID: \"adee29a3-c626-4dc3-8323-ce1e852faea3\") " pod="openshift-marketplace/certified-operators-9tq2q" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.215643 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c0fe86d-3006-44b3-811a-f28758ef07fd-catalog-content\") pod \"community-operators-9gzfj\" (UID: \"7c0fe86d-3006-44b3-811a-f28758ef07fd\") " pod="openshift-marketplace/community-operators-9gzfj" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.215687 4703 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c0fe86d-3006-44b3-811a-f28758ef07fd-utilities\") pod \"community-operators-9gzfj\" (UID: \"7c0fe86d-3006-44b3-811a-f28758ef07fd\") " pod="openshift-marketplace/community-operators-9gzfj" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.215732 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adee29a3-c626-4dc3-8323-ce1e852faea3-utilities\") pod \"certified-operators-9tq2q\" (UID: \"adee29a3-c626-4dc3-8323-ce1e852faea3\") " pod="openshift-marketplace/certified-operators-9tq2q" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.215778 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwqvr\" (UniqueName: \"kubernetes.io/projected/7c0fe86d-3006-44b3-811a-f28758ef07fd-kube-api-access-mwqvr\") pod \"community-operators-9gzfj\" (UID: \"7c0fe86d-3006-44b3-811a-f28758ef07fd\") " pod="openshift-marketplace/community-operators-9gzfj" Mar 09 13:23:49 crc kubenswrapper[4703]: E0309 13:23:49.216252 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:49.716234303 +0000 UTC m=+225.683649999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.216696 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c0fe86d-3006-44b3-811a-f28758ef07fd-catalog-content\") pod \"community-operators-9gzfj\" (UID: \"7c0fe86d-3006-44b3-811a-f28758ef07fd\") " pod="openshift-marketplace/community-operators-9gzfj" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.217005 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c0fe86d-3006-44b3-811a-f28758ef07fd-utilities\") pod \"community-operators-9gzfj\" (UID: \"7c0fe86d-3006-44b3-811a-f28758ef07fd\") " pod="openshift-marketplace/community-operators-9gzfj" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.244604 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwqvr\" (UniqueName: \"kubernetes.io/projected/7c0fe86d-3006-44b3-811a-f28758ef07fd-kube-api-access-mwqvr\") pod \"community-operators-9gzfj\" (UID: \"7c0fe86d-3006-44b3-811a-f28758ef07fd\") " pod="openshift-marketplace/community-operators-9gzfj" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.316827 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: 
\"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.316896 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adee29a3-c626-4dc3-8323-ce1e852faea3-utilities\") pod \"certified-operators-9tq2q\" (UID: \"adee29a3-c626-4dc3-8323-ce1e852faea3\") " pod="openshift-marketplace/certified-operators-9tq2q" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.316947 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adee29a3-c626-4dc3-8323-ce1e852faea3-catalog-content\") pod \"certified-operators-9tq2q\" (UID: \"adee29a3-c626-4dc3-8323-ce1e852faea3\") " pod="openshift-marketplace/certified-operators-9tq2q" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.316994 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gttc\" (UniqueName: \"kubernetes.io/projected/adee29a3-c626-4dc3-8323-ce1e852faea3-kube-api-access-6gttc\") pod \"certified-operators-9tq2q\" (UID: \"adee29a3-c626-4dc3-8323-ce1e852faea3\") " pod="openshift-marketplace/certified-operators-9tq2q" Mar 09 13:23:49 crc kubenswrapper[4703]: E0309 13:23:49.317532 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:49.817519799 +0000 UTC m=+225.784935485 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.318079 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adee29a3-c626-4dc3-8323-ce1e852faea3-utilities\") pod \"certified-operators-9tq2q\" (UID: \"adee29a3-c626-4dc3-8323-ce1e852faea3\") " pod="openshift-marketplace/certified-operators-9tq2q" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.318306 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adee29a3-c626-4dc3-8323-ce1e852faea3-catalog-content\") pod \"certified-operators-9tq2q\" (UID: \"adee29a3-c626-4dc3-8323-ce1e852faea3\") " pod="openshift-marketplace/certified-operators-9tq2q" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.343194 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gttc\" (UniqueName: \"kubernetes.io/projected/adee29a3-c626-4dc3-8323-ce1e852faea3-kube-api-access-6gttc\") pod \"certified-operators-9tq2q\" (UID: \"adee29a3-c626-4dc3-8323-ce1e852faea3\") " pod="openshift-marketplace/certified-operators-9tq2q" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.419436 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:49 crc kubenswrapper[4703]: E0309 13:23:49.419664 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:49.91963461 +0000 UTC m=+225.887050296 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.420246 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:49 crc kubenswrapper[4703]: E0309 13:23:49.420629 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:49.920618139 +0000 UTC m=+225.888033875 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.447429 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zc87r"] Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.455155 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9gzfj" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.472481 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9tq2q" Mar 09 13:23:49 crc kubenswrapper[4703]: W0309 13:23:49.482599 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ab69d10_042e_422a_830e_65d3d1132197.slice/crio-d005c1640c941b89e9f88e52bcb4d0cd3aafdf1fe2822f6a1909746ef12472d7 WatchSource:0}: Error finding container d005c1640c941b89e9f88e52bcb4d0cd3aafdf1fe2822f6a1909746ef12472d7: Status 404 returned error can't find the container with id d005c1640c941b89e9f88e52bcb4d0cd3aafdf1fe2822f6a1909746ef12472d7 Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.493660 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kvftn"] Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.501297 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5"] Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.502026 4703 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.506177 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.509341 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5"] Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.509791 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.510084 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.519975 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.520318 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.520467 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.521205 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:49 crc kubenswrapper[4703]: E0309 13:23:49.521367 4703 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:50.021337829 +0000 UTC m=+225.988753515 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.521492 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:49 crc kubenswrapper[4703]: E0309 13:23:49.521818 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:50.021802523 +0000 UTC m=+225.989218209 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:49 crc kubenswrapper[4703]: W0309 13:23:49.549347 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod174b5189_1afe_40a3_813b_052dd29ad296.slice/crio-19c102f26fa4152fccd473aef64ecc2ff89db438221496422ffbff4fa3cc56c6 WatchSource:0}: Error finding container 19c102f26fa4152fccd473aef64ecc2ff89db438221496422ffbff4fa3cc56c6: Status 404 returned error can't find the container with id 19c102f26fa4152fccd473aef64ecc2ff89db438221496422ffbff4fa3cc56c6 Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.613933 4703 patch_prober.go:28] interesting pod/router-default-5444994796-nzj56 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:23:49 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Mar 09 13:23:49 crc kubenswrapper[4703]: [+]process-running ok Mar 09 13:23:49 crc kubenswrapper[4703]: healthz check failed Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.613994 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.622394 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.622545 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-config\") pod \"route-controller-manager-77dd4cd8f8-25pv5\" (UID: \"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32\") " pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.622690 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-serving-cert\") pod \"route-controller-manager-77dd4cd8f8-25pv5\" (UID: \"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32\") " pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.622769 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-client-ca\") pod \"route-controller-manager-77dd4cd8f8-25pv5\" (UID: \"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32\") " pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.622793 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z8mq\" (UniqueName: \"kubernetes.io/projected/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-kube-api-access-7z8mq\") pod \"route-controller-manager-77dd4cd8f8-25pv5\" (UID: \"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32\") " 
pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" Mar 09 13:23:49 crc kubenswrapper[4703]: E0309 13:23:49.623532 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:50.123516662 +0000 UTC m=+226.090932348 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.725059 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z8mq\" (UniqueName: \"kubernetes.io/projected/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-kube-api-access-7z8mq\") pod \"route-controller-manager-77dd4cd8f8-25pv5\" (UID: \"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32\") " pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.725431 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-config\") pod \"route-controller-manager-77dd4cd8f8-25pv5\" (UID: \"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32\") " pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.725488 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.725518 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-serving-cert\") pod \"route-controller-manager-77dd4cd8f8-25pv5\" (UID: \"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32\") " pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.725573 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-client-ca\") pod \"route-controller-manager-77dd4cd8f8-25pv5\" (UID: \"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32\") " pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.726497 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-client-ca\") pod \"route-controller-manager-77dd4cd8f8-25pv5\" (UID: \"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32\") " pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.727770 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-config\") pod \"route-controller-manager-77dd4cd8f8-25pv5\" (UID: \"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32\") " pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" Mar 09 13:23:49 crc 
kubenswrapper[4703]: E0309 13:23:49.729099 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:50.229086405 +0000 UTC m=+226.196502091 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.735745 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-serving-cert\") pod \"route-controller-manager-77dd4cd8f8-25pv5\" (UID: \"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32\") " pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.749927 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z8mq\" (UniqueName: \"kubernetes.io/projected/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-kube-api-access-7z8mq\") pod \"route-controller-manager-77dd4cd8f8-25pv5\" (UID: \"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32\") " pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.810297 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zc87r" event={"ID":"0ab69d10-042e-422a-830e-65d3d1132197","Type":"ContainerStarted","Data":"e9e223abb9217414105f77ca1579fa77fc0de1440d588f4619728b1d3b6a839e"} Mar 09 
13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.810335 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zc87r" event={"ID":"0ab69d10-042e-422a-830e-65d3d1132197","Type":"ContainerStarted","Data":"d005c1640c941b89e9f88e52bcb4d0cd3aafdf1fe2822f6a1909746ef12472d7"} Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.811593 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kvftn" event={"ID":"174b5189-1afe-40a3-813b-052dd29ad296","Type":"ContainerStarted","Data":"0cd02cd3e237805eda7959fefad7349da3574c1f5972d6c49c39f9d096f6baab"} Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.811613 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kvftn" event={"ID":"174b5189-1afe-40a3-813b-052dd29ad296","Type":"ContainerStarted","Data":"19c102f26fa4152fccd473aef64ecc2ff89db438221496422ffbff4fa3cc56c6"} Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.818607 4703 generic.go:334] "Generic (PLEG): container finished" podID="ed696cd9-4261-4e65-a89d-17f918249fc9" containerID="dbbe1b07469d53314ae644eec26cd29504b9e221cae1fe08bdc1f221bdf23dde" exitCode=0 Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.818660 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" event={"ID":"ed696cd9-4261-4e65-a89d-17f918249fc9","Type":"ContainerDied","Data":"dbbe1b07469d53314ae644eec26cd29504b9e221cae1fe08bdc1f221bdf23dde"} Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.827626 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:49 crc kubenswrapper[4703]: 
E0309 13:23:49.828018 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:50.328004761 +0000 UTC m=+226.295420447 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.836073 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5"
Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.908344 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9tq2q"]
Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.929949 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9"
Mar 09 13:23:49 crc kubenswrapper[4703]: E0309 13:23:49.931676 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:50.431662148 +0000 UTC m=+226.399077834 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:49 crc kubenswrapper[4703]: W0309 13:23:49.950734 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadee29a3_c626_4dc3_8323_ce1e852faea3.slice/crio-a79b07add877772bfcd1f6fbc83d5ce903f6d5a0078394174bbfbd686558f956 WatchSource:0}: Error finding container a79b07add877772bfcd1f6fbc83d5ce903f6d5a0078394174bbfbd686558f956: Status 404 returned error can't find the container with id a79b07add877772bfcd1f6fbc83d5ce903f6d5a0078394174bbfbd686558f956
Mar 09 13:23:49 crc kubenswrapper[4703]: I0309 13:23:49.961654 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.031780 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed696cd9-4261-4e65-a89d-17f918249fc9-client-ca\") pod \"ed696cd9-4261-4e65-a89d-17f918249fc9\" (UID: \"ed696cd9-4261-4e65-a89d-17f918249fc9\") "
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.031834 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed696cd9-4261-4e65-a89d-17f918249fc9-proxy-ca-bundles\") pod \"ed696cd9-4261-4e65-a89d-17f918249fc9\" (UID: \"ed696cd9-4261-4e65-a89d-17f918249fc9\") "
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.031877 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed696cd9-4261-4e65-a89d-17f918249fc9-config\") pod \"ed696cd9-4261-4e65-a89d-17f918249fc9\" (UID: \"ed696cd9-4261-4e65-a89d-17f918249fc9\") "
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.032035 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.032065 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9m6f\" (UniqueName: \"kubernetes.io/projected/ed696cd9-4261-4e65-a89d-17f918249fc9-kube-api-access-m9m6f\") pod \"ed696cd9-4261-4e65-a89d-17f918249fc9\" (UID: \"ed696cd9-4261-4e65-a89d-17f918249fc9\") "
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.032103 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed696cd9-4261-4e65-a89d-17f918249fc9-serving-cert\") pod \"ed696cd9-4261-4e65-a89d-17f918249fc9\" (UID: \"ed696cd9-4261-4e65-a89d-17f918249fc9\") "
Mar 09 13:23:50 crc kubenswrapper[4703]: E0309 13:23:50.032890 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:50.532864661 +0000 UTC m=+226.500280347 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.034315 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed696cd9-4261-4e65-a89d-17f918249fc9-config" (OuterVolumeSpecName: "config") pod "ed696cd9-4261-4e65-a89d-17f918249fc9" (UID: "ed696cd9-4261-4e65-a89d-17f918249fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.034312 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.034343 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed696cd9-4261-4e65-a89d-17f918249fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "ed696cd9-4261-4e65-a89d-17f918249fc9" (UID: "ed696cd9-4261-4e65-a89d-17f918249fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:23:50 crc kubenswrapper[4703]: E0309 13:23:50.034631 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:50.534616033 +0000 UTC m=+226.502031779 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.034792 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed696cd9-4261-4e65-a89d-17f918249fc9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ed696cd9-4261-4e65-a89d-17f918249fc9" (UID: "ed696cd9-4261-4e65-a89d-17f918249fc9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.035241 4703 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed696cd9-4261-4e65-a89d-17f918249fc9-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.035260 4703 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed696cd9-4261-4e65-a89d-17f918249fc9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.035271 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed696cd9-4261-4e65-a89d-17f918249fc9-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.040571 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed696cd9-4261-4e65-a89d-17f918249fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ed696cd9-4261-4e65-a89d-17f918249fc9" (UID: "ed696cd9-4261-4e65-a89d-17f918249fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.056782 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed696cd9-4261-4e65-a89d-17f918249fc9-kube-api-access-m9m6f" (OuterVolumeSpecName: "kube-api-access-m9m6f") pod "ed696cd9-4261-4e65-a89d-17f918249fc9" (UID: "ed696cd9-4261-4e65-a89d-17f918249fc9"). InnerVolumeSpecName "kube-api-access-m9m6f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.066089 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gnllw"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.136475 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.136796 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed696cd9-4261-4e65-a89d-17f918249fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.136816 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9m6f\" (UniqueName: \"kubernetes.io/projected/ed696cd9-4261-4e65-a89d-17f918249fc9-kube-api-access-m9m6f\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:50 crc kubenswrapper[4703]: E0309 13:23:50.136915 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:50.636898149 +0000 UTC m=+226.604313835 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.143179 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9gzfj"]
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.186080 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5"]
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.238305 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9"
Mar 09 13:23:50 crc kubenswrapper[4703]: E0309 13:23:50.239327 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:50.739315299 +0000 UTC m=+226.706730985 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.340636 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:23:50 crc kubenswrapper[4703]: E0309 13:23:50.341034 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:50.841018408 +0000 UTC m=+226.808434094 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.441952 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9"
Mar 09 13:23:50 crc kubenswrapper[4703]: E0309 13:23:50.442496 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:50.94248228 +0000 UTC m=+226.909897966 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.542903 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:23:50 crc kubenswrapper[4703]: E0309 13:23:50.543302 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:51.043266711 +0000 UTC m=+227.010682407 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.580779 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 09 13:23:50 crc kubenswrapper[4703]: E0309 13:23:50.581890 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed696cd9-4261-4e65-a89d-17f918249fc9" containerName="controller-manager"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.581947 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed696cd9-4261-4e65-a89d-17f918249fc9" containerName="controller-manager"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.582267 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed696cd9-4261-4e65-a89d-17f918249fc9" containerName="controller-manager"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.582875 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.584610 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.587487 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.590820 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.611384 4703 patch_prober.go:28] interesting pod/router-default-5444994796-nzj56 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 13:23:50 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld
Mar 09 13:23:50 crc kubenswrapper[4703]: [+]process-running ok
Mar 09 13:23:50 crc kubenswrapper[4703]: healthz check failed
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.611447 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.645272 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3dac915c-5ef5-4e57-ba83-afe822e716e9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3dac915c-5ef5-4e57-ba83-afe822e716e9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.645318 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3dac915c-5ef5-4e57-ba83-afe822e716e9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3dac915c-5ef5-4e57-ba83-afe822e716e9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.645349 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9"
Mar 09 13:23:50 crc kubenswrapper[4703]: E0309 13:23:50.645660 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:51.14564894 +0000 UTC m=+227.113064626 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.656306 4703 ???:1] "http: TLS handshake error from 192.168.126.11:42608: no serving certificate available for the kubelet"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.720260 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04b27022-1294-45a5-90c4-17d007e9b468" path="/var/lib/kubelet/pods/04b27022-1294-45a5-90c4-17d007e9b468/volumes"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.740469 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j5rmg"]
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.741281 4703 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.741565 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5rmg"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.745156 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.749337 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.749626 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3dac915c-5ef5-4e57-ba83-afe822e716e9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3dac915c-5ef5-4e57-ba83-afe822e716e9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.749669 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3dac915c-5ef5-4e57-ba83-afe822e716e9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3dac915c-5ef5-4e57-ba83-afe822e716e9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.749814 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3dac915c-5ef5-4e57-ba83-afe822e716e9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3dac915c-5ef5-4e57-ba83-afe822e716e9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 09 13:23:50 crc kubenswrapper[4703]: E0309 13:23:50.749906 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:51.249892414 +0000 UTC m=+227.217308100 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.759931 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5rmg"]
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.782165 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3dac915c-5ef5-4e57-ba83-afe822e716e9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3dac915c-5ef5-4e57-ba83-afe822e716e9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.845276 4703 generic.go:334] "Generic (PLEG): container finished" podID="7c0fe86d-3006-44b3-811a-f28758ef07fd" containerID="0e0e3bde08befa95e70e60806abba8fb4b215fda341789e25b4085443a19b938" exitCode=0
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.845736 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gzfj" event={"ID":"7c0fe86d-3006-44b3-811a-f28758ef07fd","Type":"ContainerDied","Data":"0e0e3bde08befa95e70e60806abba8fb4b215fda341789e25b4085443a19b938"}
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.845804 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gzfj" event={"ID":"7c0fe86d-3006-44b3-811a-f28758ef07fd","Type":"ContainerStarted","Data":"ae456d88e5213923bd809ea0cefaa26759897e2a30818214d10cca2732e31ce8"}
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.851068 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7eceeab-0758-4f0f-88c7-5b744fbab868-utilities\") pod \"redhat-marketplace-j5rmg\" (UID: \"a7eceeab-0758-4f0f-88c7-5b744fbab868\") " pod="openshift-marketplace/redhat-marketplace-j5rmg"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.851104 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnv65\" (UniqueName: \"kubernetes.io/projected/a7eceeab-0758-4f0f-88c7-5b744fbab868-kube-api-access-rnv65\") pod \"redhat-marketplace-j5rmg\" (UID: \"a7eceeab-0758-4f0f-88c7-5b744fbab868\") " pod="openshift-marketplace/redhat-marketplace-j5rmg"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.851153 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.851176 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7eceeab-0758-4f0f-88c7-5b744fbab868-catalog-content\") pod \"redhat-marketplace-j5rmg\" (UID: \"a7eceeab-0758-4f0f-88c7-5b744fbab868\") " pod="openshift-marketplace/redhat-marketplace-j5rmg"
Mar 09 13:23:50 crc kubenswrapper[4703]: E0309 13:23:50.851461 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:51.351449819 +0000 UTC m=+227.318865505 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.853151 4703 generic.go:334] "Generic (PLEG): container finished" podID="0ab69d10-042e-422a-830e-65d3d1132197" containerID="e9e223abb9217414105f77ca1579fa77fc0de1440d588f4619728b1d3b6a839e" exitCode=0
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.853216 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zc87r" event={"ID":"0ab69d10-042e-422a-830e-65d3d1132197","Type":"ContainerDied","Data":"e9e223abb9217414105f77ca1579fa77fc0de1440d588f4619728b1d3b6a839e"}
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.855369 4703 generic.go:334] "Generic (PLEG): container finished" podID="174b5189-1afe-40a3-813b-052dd29ad296" containerID="0cd02cd3e237805eda7959fefad7349da3574c1f5972d6c49c39f9d096f6baab" exitCode=0
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.855437 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kvftn" event={"ID":"174b5189-1afe-40a3-813b-052dd29ad296","Type":"ContainerDied","Data":"0cd02cd3e237805eda7959fefad7349da3574c1f5972d6c49c39f9d096f6baab"}
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.863496 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" event={"ID":"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32","Type":"ContainerStarted","Data":"cad6b47c96f1e6cfd1ea7d10f7acba50659d996a63dbff63c0eb08e129147695"}
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.863593 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" event={"ID":"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32","Type":"ContainerStarted","Data":"f1058e9a3c339061f156d3668ce821e1daa0253f20b9550d7dc48eae36bafc2e"}
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.863881 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.879141 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.880105 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fjq8h" event={"ID":"ed696cd9-4261-4e65-a89d-17f918249fc9","Type":"ContainerDied","Data":"3e23f1062eaa77e2e6ddfba8c2dcb127e264ad8105e1896739042baccc184dea"}
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.880159 4703 scope.go:117] "RemoveContainer" containerID="dbbe1b07469d53314ae644eec26cd29504b9e221cae1fe08bdc1f221bdf23dde"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.885891 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.886292 4703 generic.go:334] "Generic (PLEG): container finished" podID="adee29a3-c626-4dc3-8323-ce1e852faea3" containerID="41e1c1cf8f3dfd6382fe1d8bdb300394e54950f0864775a17b5cccef6a5fbd85" exitCode=0
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.886986 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tq2q" event={"ID":"adee29a3-c626-4dc3-8323-ce1e852faea3","Type":"ContainerDied","Data":"41e1c1cf8f3dfd6382fe1d8bdb300394e54950f0864775a17b5cccef6a5fbd85"}
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.887009 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tq2q" event={"ID":"adee29a3-c626-4dc3-8323-ce1e852faea3","Type":"ContainerStarted","Data":"a79b07add877772bfcd1f6fbc83d5ce903f6d5a0078394174bbfbd686558f956"}
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.893931 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" event={"ID":"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6","Type":"ContainerStarted","Data":"cbb67ab1a2744cf302e26ab535a0a3014e9acae3265230fa5a717efcfe283fa1"}
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.896070 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" event={"ID":"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6","Type":"ContainerStarted","Data":"0a5651641c1e48dd01b8b5bf02779a8ffd95aefc40383521341cad712ad45bbb"}
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.901164 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.917168 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" podStartSLOduration=3.917151369 podStartE2EDuration="3.917151369s" podCreationTimestamp="2026-03-09 13:23:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:50.913596533 +0000 UTC m=+226.881012229" watchObservedRunningTime="2026-03-09 13:23:50.917151369 +0000 UTC m=+226.884567055"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.952292 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.952681 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7eceeab-0758-4f0f-88c7-5b744fbab868-catalog-content\") pod \"redhat-marketplace-j5rmg\" (UID: \"a7eceeab-0758-4f0f-88c7-5b744fbab868\") " pod="openshift-marketplace/redhat-marketplace-j5rmg"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.952806 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7eceeab-0758-4f0f-88c7-5b744fbab868-utilities\") pod \"redhat-marketplace-j5rmg\" (UID: \"a7eceeab-0758-4f0f-88c7-5b744fbab868\") " pod="openshift-marketplace/redhat-marketplace-j5rmg"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.952860 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnv65\" (UniqueName: \"kubernetes.io/projected/a7eceeab-0758-4f0f-88c7-5b744fbab868-kube-api-access-rnv65\") pod \"redhat-marketplace-j5rmg\" (UID: \"a7eceeab-0758-4f0f-88c7-5b744fbab868\") " pod="openshift-marketplace/redhat-marketplace-j5rmg"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.953906 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7eceeab-0758-4f0f-88c7-5b744fbab868-utilities\") pod \"redhat-marketplace-j5rmg\" (UID: \"a7eceeab-0758-4f0f-88c7-5b744fbab868\") " pod="openshift-marketplace/redhat-marketplace-j5rmg"
Mar 09 13:23:50 crc kubenswrapper[4703]: E0309 13:23:50.954019 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:51.453996783 +0000 UTC m=+227.421412479 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.954656 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7eceeab-0758-4f0f-88c7-5b744fbab868-catalog-content\") pod \"redhat-marketplace-j5rmg\" (UID: \"a7eceeab-0758-4f0f-88c7-5b744fbab868\") " pod="openshift-marketplace/redhat-marketplace-j5rmg"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.982659 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fjq8h"]
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.983366 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnv65\" (UniqueName: \"kubernetes.io/projected/a7eceeab-0758-4f0f-88c7-5b744fbab868-kube-api-access-rnv65\") pod \"redhat-marketplace-j5rmg\" (UID: \"a7eceeab-0758-4f0f-88c7-5b744fbab868\") " pod="openshift-marketplace/redhat-marketplace-j5rmg"
Mar 09 13:23:50 crc kubenswrapper[4703]: I0309 13:23:50.986110 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fjq8h"]
Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.054569 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") 
" pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:51 crc kubenswrapper[4703]: E0309 13:23:51.055106 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:51.555094943 +0000 UTC m=+227.522510619 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.080352 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5rmg" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.145894 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6pqzw"] Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.150421 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pqzw" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.153414 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pqzw"] Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.156422 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:51 crc kubenswrapper[4703]: E0309 13:23:51.156676 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:51.656584806 +0000 UTC m=+227.624000502 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.224034 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.258375 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3916d955-3b83-488d-959c-88b09424a3e3-utilities\") pod \"redhat-marketplace-6pqzw\" (UID: \"3916d955-3b83-488d-959c-88b09424a3e3\") " pod="openshift-marketplace/redhat-marketplace-6pqzw" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.258460 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3916d955-3b83-488d-959c-88b09424a3e3-catalog-content\") pod \"redhat-marketplace-6pqzw\" (UID: \"3916d955-3b83-488d-959c-88b09424a3e3\") " pod="openshift-marketplace/redhat-marketplace-6pqzw" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.258567 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.258591 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg75f\" (UniqueName: \"kubernetes.io/projected/3916d955-3b83-488d-959c-88b09424a3e3-kube-api-access-fg75f\") pod \"redhat-marketplace-6pqzw\" (UID: \"3916d955-3b83-488d-959c-88b09424a3e3\") " pod="openshift-marketplace/redhat-marketplace-6pqzw" Mar 09 13:23:51 crc kubenswrapper[4703]: E0309 13:23:51.258939 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:51.758919103 +0000 UTC m=+227.726334889 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.360266 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.360444 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3916d955-3b83-488d-959c-88b09424a3e3-utilities\") pod \"redhat-marketplace-6pqzw\" (UID: \"3916d955-3b83-488d-959c-88b09424a3e3\") " pod="openshift-marketplace/redhat-marketplace-6pqzw" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.360493 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3916d955-3b83-488d-959c-88b09424a3e3-catalog-content\") pod \"redhat-marketplace-6pqzw\" (UID: \"3916d955-3b83-488d-959c-88b09424a3e3\") " pod="openshift-marketplace/redhat-marketplace-6pqzw" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.360547 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg75f\" (UniqueName: \"kubernetes.io/projected/3916d955-3b83-488d-959c-88b09424a3e3-kube-api-access-fg75f\") pod \"redhat-marketplace-6pqzw\" (UID: \"3916d955-3b83-488d-959c-88b09424a3e3\") " pod="openshift-marketplace/redhat-marketplace-6pqzw" Mar 09 13:23:51 crc kubenswrapper[4703]: E0309 13:23:51.360906 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:51.86088744 +0000 UTC m=+227.828303126 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.361306 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3916d955-3b83-488d-959c-88b09424a3e3-utilities\") pod \"redhat-marketplace-6pqzw\" (UID: \"3916d955-3b83-488d-959c-88b09424a3e3\") " pod="openshift-marketplace/redhat-marketplace-6pqzw" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.362305 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3916d955-3b83-488d-959c-88b09424a3e3-catalog-content\") pod \"redhat-marketplace-6pqzw\" (UID: \"3916d955-3b83-488d-959c-88b09424a3e3\") " pod="openshift-marketplace/redhat-marketplace-6pqzw" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.378881 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg75f\" (UniqueName: \"kubernetes.io/projected/3916d955-3b83-488d-959c-88b09424a3e3-kube-api-access-fg75f\") pod \"redhat-marketplace-6pqzw\" (UID: \"3916d955-3b83-488d-959c-88b09424a3e3\") " pod="openshift-marketplace/redhat-marketplace-6pqzw" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.461645 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:51 crc kubenswrapper[4703]: E0309 13:23:51.461991 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 13:23:51.961978161 +0000 UTC m=+227.929393847 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lddp9" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.481330 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pqzw" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.504778 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk"] Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.507963 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.508470 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk"] Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.510220 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.511054 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.511302 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.511721 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.515484 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.520505 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.523655 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.524656 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5rmg"] Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.567901 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:51 crc kubenswrapper[4703]: E0309 13:23:51.568419 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 13:23:52.068400869 +0000 UTC m=+228.035816555 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.577989 4703 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-09T13:23:50.741298849Z","Handler":null,"Name":""} Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.587167 4703 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.587198 4703 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.613306 4703 patch_prober.go:28] interesting pod/router-default-5444994796-nzj56 container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:23:51 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Mar 09 13:23:51 crc kubenswrapper[4703]: [+]process-running ok Mar 09 13:23:51 crc kubenswrapper[4703]: healthz check failed Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.613733 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.670364 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st6m4\" (UniqueName: \"kubernetes.io/projected/1d126787-db9b-4b15-92f7-26aa470f3d18-kube-api-access-st6m4\") pod \"controller-manager-6cd86c7dd-dd9hk\" (UID: \"1d126787-db9b-4b15-92f7-26aa470f3d18\") " pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.670413 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d126787-db9b-4b15-92f7-26aa470f3d18-serving-cert\") pod \"controller-manager-6cd86c7dd-dd9hk\" (UID: \"1d126787-db9b-4b15-92f7-26aa470f3d18\") " pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.670441 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d126787-db9b-4b15-92f7-26aa470f3d18-proxy-ca-bundles\") pod \"controller-manager-6cd86c7dd-dd9hk\" (UID: \"1d126787-db9b-4b15-92f7-26aa470f3d18\") " 
pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.670466 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d126787-db9b-4b15-92f7-26aa470f3d18-config\") pod \"controller-manager-6cd86c7dd-dd9hk\" (UID: \"1d126787-db9b-4b15-92f7-26aa470f3d18\") " pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.670492 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d126787-db9b-4b15-92f7-26aa470f3d18-client-ca\") pod \"controller-manager-6cd86c7dd-dd9hk\" (UID: \"1d126787-db9b-4b15-92f7-26aa470f3d18\") " pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.670532 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.673124 4703 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.673158 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.731360 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lddp9\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.764970 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m75f5"] Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.768420 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m75f5" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.769248 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m75f5"] Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.772772 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.772937 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d126787-db9b-4b15-92f7-26aa470f3d18-serving-cert\") pod \"controller-manager-6cd86c7dd-dd9hk\" (UID: \"1d126787-db9b-4b15-92f7-26aa470f3d18\") " pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.773102 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d126787-db9b-4b15-92f7-26aa470f3d18-proxy-ca-bundles\") pod \"controller-manager-6cd86c7dd-dd9hk\" (UID: \"1d126787-db9b-4b15-92f7-26aa470f3d18\") " pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.773643 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d126787-db9b-4b15-92f7-26aa470f3d18-config\") pod \"controller-manager-6cd86c7dd-dd9hk\" (UID: \"1d126787-db9b-4b15-92f7-26aa470f3d18\") " pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.773676 4703 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d126787-db9b-4b15-92f7-26aa470f3d18-client-ca\") pod \"controller-manager-6cd86c7dd-dd9hk\" (UID: \"1d126787-db9b-4b15-92f7-26aa470f3d18\") " pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.773776 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st6m4\" (UniqueName: \"kubernetes.io/projected/1d126787-db9b-4b15-92f7-26aa470f3d18-kube-api-access-st6m4\") pod \"controller-manager-6cd86c7dd-dd9hk\" (UID: \"1d126787-db9b-4b15-92f7-26aa470f3d18\") " pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.774998 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d126787-db9b-4b15-92f7-26aa470f3d18-client-ca\") pod \"controller-manager-6cd86c7dd-dd9hk\" (UID: \"1d126787-db9b-4b15-92f7-26aa470f3d18\") " pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.775110 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d126787-db9b-4b15-92f7-26aa470f3d18-proxy-ca-bundles\") pod \"controller-manager-6cd86c7dd-dd9hk\" (UID: \"1d126787-db9b-4b15-92f7-26aa470f3d18\") " pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.782321 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.783109 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d126787-db9b-4b15-92f7-26aa470f3d18-serving-cert\") 
pod \"controller-manager-6cd86c7dd-dd9hk\" (UID: \"1d126787-db9b-4b15-92f7-26aa470f3d18\") " pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.783929 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d126787-db9b-4b15-92f7-26aa470f3d18-config\") pod \"controller-manager-6cd86c7dd-dd9hk\" (UID: \"1d126787-db9b-4b15-92f7-26aa470f3d18\") " pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.796799 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.804754 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st6m4\" (UniqueName: \"kubernetes.io/projected/1d126787-db9b-4b15-92f7-26aa470f3d18-kube-api-access-st6m4\") pod \"controller-manager-6cd86c7dd-dd9hk\" (UID: \"1d126787-db9b-4b15-92f7-26aa470f3d18\") " pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.875290 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgqhz\" (UniqueName: \"kubernetes.io/projected/75d9c9d6-9701-4627-a87b-b2ae669a0eae-kube-api-access-sgqhz\") pod \"redhat-operators-m75f5\" (UID: \"75d9c9d6-9701-4627-a87b-b2ae669a0eae\") " pod="openshift-marketplace/redhat-operators-m75f5" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.875424 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d9c9d6-9701-4627-a87b-b2ae669a0eae-utilities\") pod \"redhat-operators-m75f5\" (UID: \"75d9c9d6-9701-4627-a87b-b2ae669a0eae\") " pod="openshift-marketplace/redhat-operators-m75f5" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.875458 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d9c9d6-9701-4627-a87b-b2ae669a0eae-catalog-content\") pod \"redhat-operators-m75f5\" (UID: \"75d9c9d6-9701-4627-a87b-b2ae669a0eae\") " pod="openshift-marketplace/redhat-operators-m75f5" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.894622 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.914835 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" event={"ID":"7dbf6b37-f8d9-4a20-84e0-e9b137a29db6","Type":"ContainerStarted","Data":"b89bae919a1a4cb799a4cb262368f79946bfd8184ab21615fd3375120c9a54ec"} Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.920488 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3dac915c-5ef5-4e57-ba83-afe822e716e9","Type":"ContainerStarted","Data":"5a1e0d0a13973af69b16c21bff2c3d57460aa629a99a7700cc916134db8131be"} Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.920545 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3dac915c-5ef5-4e57-ba83-afe822e716e9","Type":"ContainerStarted","Data":"413da524365bd27d543450840560a03e634cd6acbbf191c09dc1802b7794d218"} Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.920579 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lddp9"
Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.926462 4703 generic.go:334] "Generic (PLEG): container finished" podID="330bced9-c478-4eb6-8b4d-10d69e5a6965" containerID="c02a66815067f308ad8dd7aeb9a18255fe4ba2d03ae1b2ad0e708fc7d267f6b6" exitCode=0
Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.926518 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-4mjnk" event={"ID":"330bced9-c478-4eb6-8b4d-10d69e5a6965","Type":"ContainerDied","Data":"c02a66815067f308ad8dd7aeb9a18255fe4ba2d03ae1b2ad0e708fc7d267f6b6"}
Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.936741 4703 generic.go:334] "Generic (PLEG): container finished" podID="a7eceeab-0758-4f0f-88c7-5b744fbab868" containerID="0d9dd789088be34cb1755b88473928c3cb0283e63e8e106061d6a83e515f5bed" exitCode=0
Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.936869 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5rmg" event={"ID":"a7eceeab-0758-4f0f-88c7-5b744fbab868","Type":"ContainerDied","Data":"0d9dd789088be34cb1755b88473928c3cb0283e63e8e106061d6a83e515f5bed"}
Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.936907 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5rmg" event={"ID":"a7eceeab-0758-4f0f-88c7-5b744fbab868","Type":"ContainerStarted","Data":"83cfa815c4fc21adceb3e28d54ae3bbb23fa192e3af53bb135996f8ccd41e098"}
Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.938780 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-pjlkd" podStartSLOduration=12.938761963 podStartE2EDuration="12.938761963s" podCreationTimestamp="2026-03-09 13:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:51.935373602 +0000 UTC m=+227.902789308" watchObservedRunningTime="2026-03-09 13:23:51.938761963 +0000 UTC m=+227.906177649"
Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.979657 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.979640136 podStartE2EDuration="1.979640136s" podCreationTimestamp="2026-03-09 13:23:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:51.952792339 +0000 UTC m=+227.920208045" watchObservedRunningTime="2026-03-09 13:23:51.979640136 +0000 UTC m=+227.947055822"
Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.982294 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-jbdq8"
Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.982340 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-jbdq8"
Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.982480 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d9c9d6-9701-4627-a87b-b2ae669a0eae-utilities\") pod \"redhat-operators-m75f5\" (UID: \"75d9c9d6-9701-4627-a87b-b2ae669a0eae\") " pod="openshift-marketplace/redhat-operators-m75f5"
Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.982516 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d9c9d6-9701-4627-a87b-b2ae669a0eae-catalog-content\") pod \"redhat-operators-m75f5\" (UID: \"75d9c9d6-9701-4627-a87b-b2ae669a0eae\") " pod="openshift-marketplace/redhat-operators-m75f5"
Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.983026 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgqhz\" (UniqueName: \"kubernetes.io/projected/75d9c9d6-9701-4627-a87b-b2ae669a0eae-kube-api-access-sgqhz\") pod \"redhat-operators-m75f5\" (UID: \"75d9c9d6-9701-4627-a87b-b2ae669a0eae\") " pod="openshift-marketplace/redhat-operators-m75f5"
Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.982965 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d9c9d6-9701-4627-a87b-b2ae669a0eae-utilities\") pod \"redhat-operators-m75f5\" (UID: \"75d9c9d6-9701-4627-a87b-b2ae669a0eae\") " pod="openshift-marketplace/redhat-operators-m75f5"
Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.985447 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d9c9d6-9701-4627-a87b-b2ae669a0eae-catalog-content\") pod \"redhat-operators-m75f5\" (UID: \"75d9c9d6-9701-4627-a87b-b2ae669a0eae\") " pod="openshift-marketplace/redhat-operators-m75f5"
Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.997541 4703 patch_prober.go:28] interesting pod/console-f9d7485db-jbdq8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body=
Mar 09 13:23:51 crc kubenswrapper[4703]: I0309 13:23:51.997617 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-jbdq8" podUID="98d3f0fa-d5c6-4288-af8a-bdc0b29dab63" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.005032 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgqhz\" (UniqueName: \"kubernetes.io/projected/75d9c9d6-9701-4627-a87b-b2ae669a0eae-kube-api-access-sgqhz\") pod \"redhat-operators-m75f5\" (UID: \"75d9c9d6-9701-4627-a87b-b2ae669a0eae\") " pod="openshift-marketplace/redhat-operators-m75f5"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.032479 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pqzw"]
Mar 09 13:23:52 crc kubenswrapper[4703]: W0309 13:23:52.040117 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3916d955_3b83_488d_959c_88b09424a3e3.slice/crio-2d20e1063bb57c11f6763c69cc818cdffc5be220dd8425835a0ae66ea41c0f5b WatchSource:0}: Error finding container 2d20e1063bb57c11f6763c69cc818cdffc5be220dd8425835a0ae66ea41c0f5b: Status 404 returned error can't find the container with id 2d20e1063bb57c11f6763c69cc818cdffc5be220dd8425835a0ae66ea41c0f5b
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.122876 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m75f5"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.150349 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gkg4j"]
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.151333 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gkg4j"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.184753 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gkg4j"]
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.199276 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk"]
Mar 09 13:23:52 crc kubenswrapper[4703]: W0309 13:23:52.234858 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d126787_db9b_4b15_92f7_26aa470f3d18.slice/crio-386c4b23696688b7dbc93b07d643c0dc0f0dc0e0ddb3914a7016ae824a4790ea WatchSource:0}: Error finding container 386c4b23696688b7dbc93b07d643c0dc0f0dc0e0ddb3914a7016ae824a4790ea: Status 404 returned error can't find the container with id 386c4b23696688b7dbc93b07d643c0dc0f0dc0e0ddb3914a7016ae824a4790ea
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.288458 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09a58d8e-1a70-4e6c-b217-1c2b2537accc-utilities\") pod \"redhat-operators-gkg4j\" (UID: \"09a58d8e-1a70-4e6c-b217-1c2b2537accc\") " pod="openshift-marketplace/redhat-operators-gkg4j"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.288492 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09a58d8e-1a70-4e6c-b217-1c2b2537accc-catalog-content\") pod \"redhat-operators-gkg4j\" (UID: \"09a58d8e-1a70-4e6c-b217-1c2b2537accc\") " pod="openshift-marketplace/redhat-operators-gkg4j"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.288535 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glzwf\" (UniqueName: \"kubernetes.io/projected/09a58d8e-1a70-4e6c-b217-1c2b2537accc-kube-api-access-glzwf\") pod \"redhat-operators-gkg4j\" (UID: \"09a58d8e-1a70-4e6c-b217-1c2b2537accc\") " pod="openshift-marketplace/redhat-operators-gkg4j"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.390290 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09a58d8e-1a70-4e6c-b217-1c2b2537accc-utilities\") pod \"redhat-operators-gkg4j\" (UID: \"09a58d8e-1a70-4e6c-b217-1c2b2537accc\") " pod="openshift-marketplace/redhat-operators-gkg4j"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.390339 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09a58d8e-1a70-4e6c-b217-1c2b2537accc-catalog-content\") pod \"redhat-operators-gkg4j\" (UID: \"09a58d8e-1a70-4e6c-b217-1c2b2537accc\") " pod="openshift-marketplace/redhat-operators-gkg4j"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.390395 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glzwf\" (UniqueName: \"kubernetes.io/projected/09a58d8e-1a70-4e6c-b217-1c2b2537accc-kube-api-access-glzwf\") pod \"redhat-operators-gkg4j\" (UID: \"09a58d8e-1a70-4e6c-b217-1c2b2537accc\") " pod="openshift-marketplace/redhat-operators-gkg4j"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.392012 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09a58d8e-1a70-4e6c-b217-1c2b2537accc-utilities\") pod \"redhat-operators-gkg4j\" (UID: \"09a58d8e-1a70-4e6c-b217-1c2b2537accc\") " pod="openshift-marketplace/redhat-operators-gkg4j"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.392292 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09a58d8e-1a70-4e6c-b217-1c2b2537accc-catalog-content\") pod \"redhat-operators-gkg4j\" (UID: \"09a58d8e-1a70-4e6c-b217-1c2b2537accc\") " pod="openshift-marketplace/redhat-operators-gkg4j"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.412211 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glzwf\" (UniqueName: \"kubernetes.io/projected/09a58d8e-1a70-4e6c-b217-1c2b2537accc-kube-api-access-glzwf\") pod \"redhat-operators-gkg4j\" (UID: \"09a58d8e-1a70-4e6c-b217-1c2b2537accc\") " pod="openshift-marketplace/redhat-operators-gkg4j"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.420127 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m75f5"]
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.477329 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.477371 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.477954 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lddp9"]
Mar 09 13:23:52 crc kubenswrapper[4703]: W0309 13:23:52.483662 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75d9c9d6_9701_4627_a87b_b2ae669a0eae.slice/crio-696c916c5c9c469aab1f5dbcfef840ba09b69ba8b4479fad8fd37157031f6d6e WatchSource:0}: Error finding container 696c916c5c9c469aab1f5dbcfef840ba09b69ba8b4479fad8fd37157031f6d6e: Status 404 returned error can't find the container with id 696c916c5c9c469aab1f5dbcfef840ba09b69ba8b4479fad8fd37157031f6d6e
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.485580 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.493921 4703 patch_prober.go:28] interesting pod/downloads-7954f5f757-vs8rs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.493974 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vs8rs" podUID="2b4dabbf-243f-4504-abc1-b34b4da6a25c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.494293 4703 patch_prober.go:28] interesting pod/downloads-7954f5f757-vs8rs container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.494374 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-vs8rs" podUID="2b4dabbf-243f-4504-abc1-b34b4da6a25c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.498036 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gkg4j"
Mar 09 13:23:52 crc kubenswrapper[4703]: W0309 13:23:52.534426 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbbe85f0_658d_442a_bf5e_5ab93c6ddbdd.slice/crio-b5a0a84afda82daca18826d95e91c3a56b8df030921d7cdfb63836f6adabc1a5 WatchSource:0}: Error finding container b5a0a84afda82daca18826d95e91c3a56b8df030921d7cdfb63836f6adabc1a5: Status 404 returned error can't find the container with id b5a0a84afda82daca18826d95e91c3a56b8df030921d7cdfb63836f6adabc1a5
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.608293 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-nzj56"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.610329 4703 patch_prober.go:28] interesting pod/router-default-5444994796-nzj56 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 13:23:52 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld
Mar 09 13:23:52 crc kubenswrapper[4703]: [+]process-running ok
Mar 09 13:23:52 crc kubenswrapper[4703]: healthz check failed
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.610389 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.730356 4703 patch_prober.go:28] interesting pod/apiserver-76f77b778f-k8tgg container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 09 13:23:52 crc kubenswrapper[4703]: [+]log ok
Mar 09 13:23:52 crc kubenswrapper[4703]: [+]etcd ok
Mar 09 13:23:52 crc kubenswrapper[4703]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 09 13:23:52 crc kubenswrapper[4703]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 09 13:23:52 crc kubenswrapper[4703]: [+]poststarthook/max-in-flight-filter ok
Mar 09 13:23:52 crc kubenswrapper[4703]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 09 13:23:52 crc kubenswrapper[4703]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 09 13:23:52 crc kubenswrapper[4703]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 09 13:23:52 crc kubenswrapper[4703]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Mar 09 13:23:52 crc kubenswrapper[4703]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 09 13:23:52 crc kubenswrapper[4703]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 09 13:23:52 crc kubenswrapper[4703]: [+]poststarthook/openshift.io-startinformers ok
Mar 09 13:23:52 crc kubenswrapper[4703]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 09 13:23:52 crc kubenswrapper[4703]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 09 13:23:52 crc kubenswrapper[4703]: livez check failed
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.730407 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" podUID="507f368b-bd66-4e0a-8e30-bc8505f3f76f" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.731582 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.732580 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed696cd9-4261-4e65-a89d-17f918249fc9" path="/var/lib/kubelet/pods/ed696cd9-4261-4e65-a89d-17f918249fc9/volumes"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.807965 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr"
Mar 09 13:23:52 crc kubenswrapper[4703]: I0309 13:23:52.895523 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gkg4j"]
Mar 09 13:23:52 crc kubenswrapper[4703]: W0309 13:23:52.928202 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09a58d8e_1a70_4e6c_b217_1c2b2537accc.slice/crio-5966fc7db128b2e2c9e4ecbd8a1a85d3006725ee4dcc0eaea435151da504fdba WatchSource:0}: Error finding container 5966fc7db128b2e2c9e4ecbd8a1a85d3006725ee4dcc0eaea435151da504fdba: Status 404 returned error can't find the container with id 5966fc7db128b2e2c9e4ecbd8a1a85d3006725ee4dcc0eaea435151da504fdba
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.049914 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" event={"ID":"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd","Type":"ContainerStarted","Data":"c7cb9f2516753f4a5344222bcf5cbde69d6dd97646a2910bee5eae8819f92728"}
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.049970 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" event={"ID":"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd","Type":"ContainerStarted","Data":"b5a0a84afda82daca18826d95e91c3a56b8df030921d7cdfb63836f6adabc1a5"}
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.051014 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-lddp9"
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.066216 4703 generic.go:334] "Generic (PLEG): container finished" podID="75d9c9d6-9701-4627-a87b-b2ae669a0eae" containerID="6a42eab526fe11f909b607ac0b4b97ff612c17e241d4f4a7b00bed5eff2595ad" exitCode=0
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.066979 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m75f5" event={"ID":"75d9c9d6-9701-4627-a87b-b2ae669a0eae","Type":"ContainerDied","Data":"6a42eab526fe11f909b607ac0b4b97ff612c17e241d4f4a7b00bed5eff2595ad"}
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.067034 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m75f5" event={"ID":"75d9c9d6-9701-4627-a87b-b2ae669a0eae","Type":"ContainerStarted","Data":"696c916c5c9c469aab1f5dbcfef840ba09b69ba8b4479fad8fd37157031f6d6e"}
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.103105 4703 generic.go:334] "Generic (PLEG): container finished" podID="3916d955-3b83-488d-959c-88b09424a3e3" containerID="5d262eea4605e3a13ece8ebc308caa17faf7cbb415905a413665c7e51a50b38b" exitCode=0
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.103223 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pqzw" event={"ID":"3916d955-3b83-488d-959c-88b09424a3e3","Type":"ContainerDied","Data":"5d262eea4605e3a13ece8ebc308caa17faf7cbb415905a413665c7e51a50b38b"}
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.103253 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pqzw" event={"ID":"3916d955-3b83-488d-959c-88b09424a3e3","Type":"ContainerStarted","Data":"2d20e1063bb57c11f6763c69cc818cdffc5be220dd8425835a0ae66ea41c0f5b"}
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.130676 4703 generic.go:334] "Generic (PLEG): container finished" podID="3dac915c-5ef5-4e57-ba83-afe822e716e9" containerID="5a1e0d0a13973af69b16c21bff2c3d57460aa629a99a7700cc916134db8131be" exitCode=0
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.130769 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3dac915c-5ef5-4e57-ba83-afe822e716e9","Type":"ContainerDied","Data":"5a1e0d0a13973af69b16c21bff2c3d57460aa629a99a7700cc916134db8131be"}
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.154858 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" podStartSLOduration=177.154826808 podStartE2EDuration="2m57.154826808s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:53.088207001 +0000 UTC m=+229.055622687" watchObservedRunningTime="2026-03-09 13:23:53.154826808 +0000 UTC m=+229.122242494"
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.188000 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gkg4j" event={"ID":"09a58d8e-1a70-4e6c-b217-1c2b2537accc","Type":"ContainerStarted","Data":"5966fc7db128b2e2c9e4ecbd8a1a85d3006725ee4dcc0eaea435151da504fdba"}
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.198310 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" event={"ID":"1d126787-db9b-4b15-92f7-26aa470f3d18","Type":"ContainerStarted","Data":"8068c54332f85d11137a7af42b3e90bf127d53048424cc8186cde8ed075dcd04"}
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.198364 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" event={"ID":"1d126787-db9b-4b15-92f7-26aa470f3d18","Type":"ContainerStarted","Data":"386c4b23696688b7dbc93b07d643c0dc0f0dc0e0ddb3914a7016ae824a4790ea"}
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.199486 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk"
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.217792 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gqh49"
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.227416 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk"
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.252796 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" podStartSLOduration=7.252781666 podStartE2EDuration="7.252781666s" podCreationTimestamp="2026-03-09 13:23:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:53.250506278 +0000 UTC m=+229.217921964" watchObservedRunningTime="2026-03-09 13:23:53.252781666 +0000 UTC m=+229.220197352"
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.612664 4703 patch_prober.go:28] interesting pod/router-default-5444994796-nzj56 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 13:23:53 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld
Mar 09 13:23:53 crc kubenswrapper[4703]: [+]process-running ok
Mar 09 13:23:53 crc kubenswrapper[4703]: healthz check failed
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.612961 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.716949 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.717756 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.722128 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.722462 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.727349 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.735425 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-4mjnk"
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.822938 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/330bced9-c478-4eb6-8b4d-10d69e5a6965-config-volume\") pod \"330bced9-c478-4eb6-8b4d-10d69e5a6965\" (UID: \"330bced9-c478-4eb6-8b4d-10d69e5a6965\") "
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.823251 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/330bced9-c478-4eb6-8b4d-10d69e5a6965-secret-volume\") pod \"330bced9-c478-4eb6-8b4d-10d69e5a6965\" (UID: \"330bced9-c478-4eb6-8b4d-10d69e5a6965\") "
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.823315 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hq8x\" (UniqueName: \"kubernetes.io/projected/330bced9-c478-4eb6-8b4d-10d69e5a6965-kube-api-access-2hq8x\") pod \"330bced9-c478-4eb6-8b4d-10d69e5a6965\" (UID: \"330bced9-c478-4eb6-8b4d-10d69e5a6965\") "
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.823535 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0faae95e-5842-4bb3-84b9-46067585d63f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0faae95e-5842-4bb3-84b9-46067585d63f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.823597 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0faae95e-5842-4bb3-84b9-46067585d63f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0faae95e-5842-4bb3-84b9-46067585d63f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.825381 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/330bced9-c478-4eb6-8b4d-10d69e5a6965-config-volume" (OuterVolumeSpecName: "config-volume") pod "330bced9-c478-4eb6-8b4d-10d69e5a6965" (UID: "330bced9-c478-4eb6-8b4d-10d69e5a6965"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.838415 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/330bced9-c478-4eb6-8b4d-10d69e5a6965-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "330bced9-c478-4eb6-8b4d-10d69e5a6965" (UID: "330bced9-c478-4eb6-8b4d-10d69e5a6965"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.839472 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/330bced9-c478-4eb6-8b4d-10d69e5a6965-kube-api-access-2hq8x" (OuterVolumeSpecName: "kube-api-access-2hq8x") pod "330bced9-c478-4eb6-8b4d-10d69e5a6965" (UID: "330bced9-c478-4eb6-8b4d-10d69e5a6965"). InnerVolumeSpecName "kube-api-access-2hq8x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.924917 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0faae95e-5842-4bb3-84b9-46067585d63f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0faae95e-5842-4bb3-84b9-46067585d63f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.925049 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0faae95e-5842-4bb3-84b9-46067585d63f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0faae95e-5842-4bb3-84b9-46067585d63f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.925163 4703 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/330bced9-c478-4eb6-8b4d-10d69e5a6965-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.925201 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hq8x\" (UniqueName: \"kubernetes.io/projected/330bced9-c478-4eb6-8b4d-10d69e5a6965-kube-api-access-2hq8x\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.925215 4703 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/330bced9-c478-4eb6-8b4d-10d69e5a6965-config-volume\") on node \"crc\" DevicePath \"\""
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.925695 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0faae95e-5842-4bb3-84b9-46067585d63f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0faae95e-5842-4bb3-84b9-46067585d63f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 13:23:53 crc kubenswrapper[4703]: I0309 13:23:53.961605 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0faae95e-5842-4bb3-84b9-46067585d63f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0faae95e-5842-4bb3-84b9-46067585d63f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 13:23:54 crc kubenswrapper[4703]: I0309 13:23:54.056234 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 13:23:54 crc kubenswrapper[4703]: I0309 13:23:54.214252 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-4mjnk" event={"ID":"330bced9-c478-4eb6-8b4d-10d69e5a6965","Type":"ContainerDied","Data":"30c24125a30470900945b60116c19d7cb0b565b8f2f3729c7ace7443062c69a1"}
Mar 09 13:23:54 crc kubenswrapper[4703]: I0309 13:23:54.214289 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30c24125a30470900945b60116c19d7cb0b565b8f2f3729c7ace7443062c69a1"
Mar 09 13:23:54 crc kubenswrapper[4703]: I0309 13:23:54.214338 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551035-4mjnk"
Mar 09 13:23:54 crc kubenswrapper[4703]: I0309 13:23:54.225592 4703 generic.go:334] "Generic (PLEG): container finished" podID="09a58d8e-1a70-4e6c-b217-1c2b2537accc" containerID="551a6f3e5645ea0713ab184bb1af25788508bbafec11147df236762e6bfe5fed" exitCode=0
Mar 09 13:23:54 crc kubenswrapper[4703]: I0309 13:23:54.227063 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gkg4j" event={"ID":"09a58d8e-1a70-4e6c-b217-1c2b2537accc","Type":"ContainerDied","Data":"551a6f3e5645ea0713ab184bb1af25788508bbafec11147df236762e6bfe5fed"}
Mar 09 13:23:54 crc kubenswrapper[4703]: I0309 13:23:54.387825 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 09 13:23:54 crc kubenswrapper[4703]: I0309 13:23:54.618954 4703 patch_prober.go:28] interesting pod/router-default-5444994796-nzj56 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 13:23:54 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld
Mar 09 13:23:54 crc kubenswrapper[4703]: [+]process-running ok
Mar 09 13:23:54 crc kubenswrapper[4703]: healthz check failed
Mar 09 13:23:54 crc kubenswrapper[4703]: I0309 13:23:54.619260 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 13:23:54 crc kubenswrapper[4703]: I0309 13:23:54.639400 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 09 13:23:54 crc kubenswrapper[4703]: I0309 13:23:54.759285 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3dac915c-5ef5-4e57-ba83-afe822e716e9-kubelet-dir\") pod \"3dac915c-5ef5-4e57-ba83-afe822e716e9\" (UID: \"3dac915c-5ef5-4e57-ba83-afe822e716e9\") "
Mar 09 13:23:54 crc kubenswrapper[4703]: I0309 13:23:54.759372 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3dac915c-5ef5-4e57-ba83-afe822e716e9-kube-api-access\") pod \"3dac915c-5ef5-4e57-ba83-afe822e716e9\" (UID: \"3dac915c-5ef5-4e57-ba83-afe822e716e9\") "
Mar 09 13:23:54 crc kubenswrapper[4703]: I0309 13:23:54.761275 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3dac915c-5ef5-4e57-ba83-afe822e716e9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3dac915c-5ef5-4e57-ba83-afe822e716e9" (UID: "3dac915c-5ef5-4e57-ba83-afe822e716e9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 13:23:54 crc kubenswrapper[4703]: I0309 13:23:54.767090 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dac915c-5ef5-4e57-ba83-afe822e716e9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3dac915c-5ef5-4e57-ba83-afe822e716e9" (UID: "3dac915c-5ef5-4e57-ba83-afe822e716e9"). InnerVolumeSpecName "kube-api-access".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:23:54 crc kubenswrapper[4703]: I0309 13:23:54.860360 4703 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3dac915c-5ef5-4e57-ba83-afe822e716e9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:54 crc kubenswrapper[4703]: I0309 13:23:54.860391 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3dac915c-5ef5-4e57-ba83-afe822e716e9-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:23:55 crc kubenswrapper[4703]: I0309 13:23:55.141474 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-trj95" Mar 09 13:23:55 crc kubenswrapper[4703]: I0309 13:23:55.250241 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0faae95e-5842-4bb3-84b9-46067585d63f","Type":"ContainerStarted","Data":"6589c3bf989201be052a4f57a48a61ca2f984aae1e97d8036f13dbc2418d44c2"} Mar 09 13:23:55 crc kubenswrapper[4703]: I0309 13:23:55.256527 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 13:23:55 crc kubenswrapper[4703]: I0309 13:23:55.264018 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3dac915c-5ef5-4e57-ba83-afe822e716e9","Type":"ContainerDied","Data":"413da524365bd27d543450840560a03e634cd6acbbf191c09dc1802b7794d218"} Mar 09 13:23:55 crc kubenswrapper[4703]: I0309 13:23:55.264435 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="413da524365bd27d543450840560a03e634cd6acbbf191c09dc1802b7794d218" Mar 09 13:23:55 crc kubenswrapper[4703]: I0309 13:23:55.610585 4703 patch_prober.go:28] interesting pod/router-default-5444994796-nzj56 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:23:55 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Mar 09 13:23:55 crc kubenswrapper[4703]: [+]process-running ok Mar 09 13:23:55 crc kubenswrapper[4703]: healthz check failed Mar 09 13:23:55 crc kubenswrapper[4703]: I0309 13:23:55.610658 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:23:55 crc kubenswrapper[4703]: I0309 13:23:55.810300 4703 ???:1] "http: TLS handshake error from 192.168.126.11:35864: no serving certificate available for the kubelet" Mar 09 13:23:56 crc kubenswrapper[4703]: I0309 13:23:56.293365 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0faae95e-5842-4bb3-84b9-46067585d63f","Type":"ContainerStarted","Data":"d3bb6f017b6bdf61283ef9ab55fdde23051ff7c27ff8427791041772e719824a"} Mar 09 13:23:56 crc 
kubenswrapper[4703]: I0309 13:23:56.307265 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.307248609 podStartE2EDuration="3.307248609s" podCreationTimestamp="2026-03-09 13:23:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:23:56.304809756 +0000 UTC m=+232.272225442" watchObservedRunningTime="2026-03-09 13:23:56.307248609 +0000 UTC m=+232.274664295" Mar 09 13:23:56 crc kubenswrapper[4703]: I0309 13:23:56.609634 4703 patch_prober.go:28] interesting pod/router-default-5444994796-nzj56 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:23:56 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Mar 09 13:23:56 crc kubenswrapper[4703]: [+]process-running ok Mar 09 13:23:56 crc kubenswrapper[4703]: healthz check failed Mar 09 13:23:56 crc kubenswrapper[4703]: I0309 13:23:56.609701 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:23:57 crc kubenswrapper[4703]: I0309 13:23:57.325599 4703 generic.go:334] "Generic (PLEG): container finished" podID="0faae95e-5842-4bb3-84b9-46067585d63f" containerID="d3bb6f017b6bdf61283ef9ab55fdde23051ff7c27ff8427791041772e719824a" exitCode=0 Mar 09 13:23:57 crc kubenswrapper[4703]: I0309 13:23:57.325799 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0faae95e-5842-4bb3-84b9-46067585d63f","Type":"ContainerDied","Data":"d3bb6f017b6bdf61283ef9ab55fdde23051ff7c27ff8427791041772e719824a"} Mar 09 13:23:57 crc 
kubenswrapper[4703]: I0309 13:23:57.618711 4703 patch_prober.go:28] interesting pod/router-default-5444994796-nzj56 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:23:57 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Mar 09 13:23:57 crc kubenswrapper[4703]: [+]process-running ok Mar 09 13:23:57 crc kubenswrapper[4703]: healthz check failed Mar 09 13:23:57 crc kubenswrapper[4703]: I0309 13:23:57.618764 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:23:57 crc kubenswrapper[4703]: I0309 13:23:57.718300 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:57 crc kubenswrapper[4703]: I0309 13:23:57.723159 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-k8tgg" Mar 09 13:23:58 crc kubenswrapper[4703]: I0309 13:23:58.193816 4703 ???:1] "http: TLS handshake error from 192.168.126.11:35878: no serving certificate available for the kubelet" Mar 09 13:23:58 crc kubenswrapper[4703]: I0309 13:23:58.611301 4703 patch_prober.go:28] interesting pod/router-default-5444994796-nzj56 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:23:58 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Mar 09 13:23:58 crc kubenswrapper[4703]: [+]process-running ok Mar 09 13:23:58 crc kubenswrapper[4703]: healthz check failed Mar 09 13:23:58 crc kubenswrapper[4703]: I0309 13:23:58.611351 4703 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:23:59 crc kubenswrapper[4703]: I0309 13:23:59.611371 4703 patch_prober.go:28] interesting pod/router-default-5444994796-nzj56 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:23:59 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Mar 09 13:23:59 crc kubenswrapper[4703]: [+]process-running ok Mar 09 13:23:59 crc kubenswrapper[4703]: healthz check failed Mar 09 13:23:59 crc kubenswrapper[4703]: I0309 13:23:59.611757 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:00 crc kubenswrapper[4703]: I0309 13:24:00.127300 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551044-rdp77"] Mar 09 13:24:00 crc kubenswrapper[4703]: E0309 13:24:00.127532 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dac915c-5ef5-4e57-ba83-afe822e716e9" containerName="pruner" Mar 09 13:24:00 crc kubenswrapper[4703]: I0309 13:24:00.127542 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dac915c-5ef5-4e57-ba83-afe822e716e9" containerName="pruner" Mar 09 13:24:00 crc kubenswrapper[4703]: E0309 13:24:00.127582 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="330bced9-c478-4eb6-8b4d-10d69e5a6965" containerName="collect-profiles" Mar 09 13:24:00 crc kubenswrapper[4703]: I0309 13:24:00.127589 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="330bced9-c478-4eb6-8b4d-10d69e5a6965" containerName="collect-profiles" Mar 
09 13:24:00 crc kubenswrapper[4703]: I0309 13:24:00.127712 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dac915c-5ef5-4e57-ba83-afe822e716e9" containerName="pruner" Mar 09 13:24:00 crc kubenswrapper[4703]: I0309 13:24:00.127747 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="330bced9-c478-4eb6-8b4d-10d69e5a6965" containerName="collect-profiles" Mar 09 13:24:00 crc kubenswrapper[4703]: I0309 13:24:00.128281 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551044-rdp77" Mar 09 13:24:00 crc kubenswrapper[4703]: I0309 13:24:00.135139 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551044-rdp77"] Mar 09 13:24:00 crc kubenswrapper[4703]: I0309 13:24:00.135218 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rkrjn" Mar 09 13:24:00 crc kubenswrapper[4703]: I0309 13:24:00.180563 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9wql\" (UniqueName: \"kubernetes.io/projected/ba5e15b8-80e8-4d9c-83eb-12dd004ca901-kube-api-access-l9wql\") pod \"auto-csr-approver-29551044-rdp77\" (UID: \"ba5e15b8-80e8-4d9c-83eb-12dd004ca901\") " pod="openshift-infra/auto-csr-approver-29551044-rdp77" Mar 09 13:24:00 crc kubenswrapper[4703]: I0309 13:24:00.282000 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9wql\" (UniqueName: \"kubernetes.io/projected/ba5e15b8-80e8-4d9c-83eb-12dd004ca901-kube-api-access-l9wql\") pod \"auto-csr-approver-29551044-rdp77\" (UID: \"ba5e15b8-80e8-4d9c-83eb-12dd004ca901\") " pod="openshift-infra/auto-csr-approver-29551044-rdp77" Mar 09 13:24:00 crc kubenswrapper[4703]: I0309 13:24:00.302489 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9wql\" (UniqueName: 
\"kubernetes.io/projected/ba5e15b8-80e8-4d9c-83eb-12dd004ca901-kube-api-access-l9wql\") pod \"auto-csr-approver-29551044-rdp77\" (UID: \"ba5e15b8-80e8-4d9c-83eb-12dd004ca901\") " pod="openshift-infra/auto-csr-approver-29551044-rdp77" Mar 09 13:24:00 crc kubenswrapper[4703]: I0309 13:24:00.446486 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551044-rdp77" Mar 09 13:24:00 crc kubenswrapper[4703]: I0309 13:24:00.610686 4703 patch_prober.go:28] interesting pod/router-default-5444994796-nzj56 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:00 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Mar 09 13:24:00 crc kubenswrapper[4703]: [+]process-running ok Mar 09 13:24:00 crc kubenswrapper[4703]: healthz check failed Mar 09 13:24:00 crc kubenswrapper[4703]: I0309 13:24:00.610766 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:00 crc kubenswrapper[4703]: I0309 13:24:00.687967 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs\") pod \"network-metrics-daemon-jlgk5\" (UID: \"967e7a44-ac71-42aa-9847-37799ff35cc0\") " pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:24:00 crc kubenswrapper[4703]: I0309 13:24:00.691363 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/967e7a44-ac71-42aa-9847-37799ff35cc0-metrics-certs\") pod \"network-metrics-daemon-jlgk5\" (UID: \"967e7a44-ac71-42aa-9847-37799ff35cc0\") " 
pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:24:00 crc kubenswrapper[4703]: I0309 13:24:00.730224 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jlgk5" Mar 09 13:24:01 crc kubenswrapper[4703]: I0309 13:24:01.608995 4703 patch_prober.go:28] interesting pod/router-default-5444994796-nzj56 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:01 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Mar 09 13:24:01 crc kubenswrapper[4703]: [+]process-running ok Mar 09 13:24:01 crc kubenswrapper[4703]: healthz check failed Mar 09 13:24:01 crc kubenswrapper[4703]: I0309 13:24:01.609404 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:01 crc kubenswrapper[4703]: I0309 13:24:01.987253 4703 patch_prober.go:28] interesting pod/console-f9d7485db-jbdq8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 09 13:24:01 crc kubenswrapper[4703]: I0309 13:24:01.987577 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-jbdq8" podUID="98d3f0fa-d5c6-4288-af8a-bdc0b29dab63" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 09 13:24:02 crc kubenswrapper[4703]: I0309 13:24:02.493397 4703 patch_prober.go:28] interesting pod/downloads-7954f5f757-vs8rs container/download-server namespace/openshift-console: Liveness probe status=failure output="Get 
\"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 09 13:24:02 crc kubenswrapper[4703]: I0309 13:24:02.493454 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-vs8rs" podUID="2b4dabbf-243f-4504-abc1-b34b4da6a25c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 09 13:24:02 crc kubenswrapper[4703]: I0309 13:24:02.493467 4703 patch_prober.go:28] interesting pod/downloads-7954f5f757-vs8rs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 09 13:24:02 crc kubenswrapper[4703]: I0309 13:24:02.493519 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vs8rs" podUID="2b4dabbf-243f-4504-abc1-b34b4da6a25c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 09 13:24:02 crc kubenswrapper[4703]: I0309 13:24:02.609707 4703 patch_prober.go:28] interesting pod/router-default-5444994796-nzj56 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:02 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Mar 09 13:24:02 crc kubenswrapper[4703]: [+]process-running ok Mar 09 13:24:02 crc kubenswrapper[4703]: healthz check failed Mar 09 13:24:02 crc kubenswrapper[4703]: I0309 13:24:02.609774 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" 
Mar 09 13:24:03 crc kubenswrapper[4703]: I0309 13:24:03.609064 4703 patch_prober.go:28] interesting pod/router-default-5444994796-nzj56 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:03 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Mar 09 13:24:03 crc kubenswrapper[4703]: [+]process-running ok Mar 09 13:24:03 crc kubenswrapper[4703]: healthz check failed Mar 09 13:24:03 crc kubenswrapper[4703]: I0309 13:24:03.609159 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:04 crc kubenswrapper[4703]: I0309 13:24:04.609653 4703 patch_prober.go:28] interesting pod/router-default-5444994796-nzj56 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:04 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Mar 09 13:24:04 crc kubenswrapper[4703]: [+]process-running ok Mar 09 13:24:04 crc kubenswrapper[4703]: healthz check failed Mar 09 13:24:04 crc kubenswrapper[4703]: I0309 13:24:04.609713 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:05 crc kubenswrapper[4703]: I0309 13:24:05.613445 4703 patch_prober.go:28] interesting pod/router-default-5444994796-nzj56 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:05 crc 
kubenswrapper[4703]: [-]has-synced failed: reason withheld Mar 09 13:24:05 crc kubenswrapper[4703]: [+]process-running ok Mar 09 13:24:05 crc kubenswrapper[4703]: healthz check failed Mar 09 13:24:05 crc kubenswrapper[4703]: I0309 13:24:05.613835 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:05 crc kubenswrapper[4703]: I0309 13:24:05.681446 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk"] Mar 09 13:24:05 crc kubenswrapper[4703]: I0309 13:24:05.681772 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" podUID="1d126787-db9b-4b15-92f7-26aa470f3d18" containerName="controller-manager" containerID="cri-o://8068c54332f85d11137a7af42b3e90bf127d53048424cc8186cde8ed075dcd04" gracePeriod=30 Mar 09 13:24:05 crc kubenswrapper[4703]: I0309 13:24:05.690027 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5"] Mar 09 13:24:05 crc kubenswrapper[4703]: I0309 13:24:05.690303 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" podUID="e0dcd249-cdd6-4b3c-865c-2a105bbcfb32" containerName="route-controller-manager" containerID="cri-o://cad6b47c96f1e6cfd1ea7d10f7acba50659d996a63dbff63c0eb08e129147695" gracePeriod=30 Mar 09 13:24:06 crc kubenswrapper[4703]: I0309 13:24:06.086501 4703 ???:1] "http: TLS handshake error from 192.168.126.11:54528: no serving certificate available for the kubelet" Mar 09 13:24:06 crc kubenswrapper[4703]: I0309 13:24:06.394658 4703 generic.go:334] "Generic (PLEG): container finished" 
podID="e0dcd249-cdd6-4b3c-865c-2a105bbcfb32" containerID="cad6b47c96f1e6cfd1ea7d10f7acba50659d996a63dbff63c0eb08e129147695" exitCode=0 Mar 09 13:24:06 crc kubenswrapper[4703]: I0309 13:24:06.394728 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" event={"ID":"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32","Type":"ContainerDied","Data":"cad6b47c96f1e6cfd1ea7d10f7acba50659d996a63dbff63c0eb08e129147695"} Mar 09 13:24:06 crc kubenswrapper[4703]: I0309 13:24:06.397034 4703 generic.go:334] "Generic (PLEG): container finished" podID="1d126787-db9b-4b15-92f7-26aa470f3d18" containerID="8068c54332f85d11137a7af42b3e90bf127d53048424cc8186cde8ed075dcd04" exitCode=0 Mar 09 13:24:06 crc kubenswrapper[4703]: I0309 13:24:06.397072 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" event={"ID":"1d126787-db9b-4b15-92f7-26aa470f3d18","Type":"ContainerDied","Data":"8068c54332f85d11137a7af42b3e90bf127d53048424cc8186cde8ed075dcd04"} Mar 09 13:24:06 crc kubenswrapper[4703]: I0309 13:24:06.610273 4703 patch_prober.go:28] interesting pod/router-default-5444994796-nzj56 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:06 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Mar 09 13:24:06 crc kubenswrapper[4703]: [+]process-running ok Mar 09 13:24:06 crc kubenswrapper[4703]: healthz check failed Mar 09 13:24:06 crc kubenswrapper[4703]: I0309 13:24:06.610348 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:07 crc kubenswrapper[4703]: I0309 13:24:07.611151 4703 patch_prober.go:28] 
interesting pod/router-default-5444994796-nzj56 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 13:24:07 crc kubenswrapper[4703]: [-]has-synced failed: reason withheld Mar 09 13:24:07 crc kubenswrapper[4703]: [+]process-running ok Mar 09 13:24:07 crc kubenswrapper[4703]: healthz check failed Mar 09 13:24:07 crc kubenswrapper[4703]: I0309 13:24:07.611504 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzj56" podUID="7b6068d6-17d2-4802-8778-e1ab076da652" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 13:24:08 crc kubenswrapper[4703]: I0309 13:24:08.612398 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-nzj56" Mar 09 13:24:08 crc kubenswrapper[4703]: I0309 13:24:08.619150 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-nzj56" Mar 09 13:24:09 crc kubenswrapper[4703]: I0309 13:24:09.500107 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:24:09 crc kubenswrapper[4703]: I0309 13:24:09.500374 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:24:09 crc kubenswrapper[4703]: I0309 13:24:09.837534 4703 patch_prober.go:28] interesting 
pod/route-controller-manager-77dd4cd8f8-25pv5 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" start-of-body= Mar 09 13:24:09 crc kubenswrapper[4703]: I0309 13:24:09.837926 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" podUID="e0dcd249-cdd6-4b3c-865c-2a105bbcfb32" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" Mar 09 13:24:11 crc kubenswrapper[4703]: I0309 13:24:11.802143 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:24:11 crc kubenswrapper[4703]: I0309 13:24:11.868971 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0faae95e-5842-4bb3-84b9-46067585d63f-kube-api-access\") pod \"0faae95e-5842-4bb3-84b9-46067585d63f\" (UID: \"0faae95e-5842-4bb3-84b9-46067585d63f\") " Mar 09 13:24:11 crc kubenswrapper[4703]: I0309 13:24:11.869385 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0faae95e-5842-4bb3-84b9-46067585d63f-kubelet-dir\") pod \"0faae95e-5842-4bb3-84b9-46067585d63f\" (UID: \"0faae95e-5842-4bb3-84b9-46067585d63f\") " Mar 09 13:24:11 crc kubenswrapper[4703]: I0309 13:24:11.871281 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0faae95e-5842-4bb3-84b9-46067585d63f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0faae95e-5842-4bb3-84b9-46067585d63f" (UID: "0faae95e-5842-4bb3-84b9-46067585d63f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:24:11 crc kubenswrapper[4703]: I0309 13:24:11.874232 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0faae95e-5842-4bb3-84b9-46067585d63f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0faae95e-5842-4bb3-84b9-46067585d63f" (UID: "0faae95e-5842-4bb3-84b9-46067585d63f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:11 crc kubenswrapper[4703]: I0309 13:24:11.895887 4703 patch_prober.go:28] interesting pod/controller-manager-6cd86c7dd-dd9hk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Mar 09 13:24:11 crc kubenswrapper[4703]: I0309 13:24:11.895946 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" podUID="1d126787-db9b-4b15-92f7-26aa470f3d18" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Mar 09 13:24:11 crc kubenswrapper[4703]: I0309 13:24:11.928569 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:24:11 crc kubenswrapper[4703]: I0309 13:24:11.970948 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0faae95e-5842-4bb3-84b9-46067585d63f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:11 crc kubenswrapper[4703]: I0309 13:24:11.970983 4703 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0faae95e-5842-4bb3-84b9-46067585d63f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 
13:24:11 crc kubenswrapper[4703]: I0309 13:24:11.993981 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:24:11 crc kubenswrapper[4703]: I0309 13:24:11.999059 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-jbdq8" Mar 09 13:24:12 crc kubenswrapper[4703]: I0309 13:24:12.444721 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 13:24:12 crc kubenswrapper[4703]: I0309 13:24:12.444719 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0faae95e-5842-4bb3-84b9-46067585d63f","Type":"ContainerDied","Data":"6589c3bf989201be052a4f57a48a61ca2f984aae1e97d8036f13dbc2418d44c2"} Mar 09 13:24:12 crc kubenswrapper[4703]: I0309 13:24:12.444789 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6589c3bf989201be052a4f57a48a61ca2f984aae1e97d8036f13dbc2418d44c2" Mar 09 13:24:12 crc kubenswrapper[4703]: I0309 13:24:12.499726 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-vs8rs" Mar 09 13:24:16 crc kubenswrapper[4703]: E0309 13:24:16.355815 4703 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 09 13:24:16 crc kubenswrapper[4703]: E0309 13:24:16.356194 4703 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 13:24:16 crc kubenswrapper[4703]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 09 
13:24:16 crc kubenswrapper[4703]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h2mmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29551042-fv4ms_openshift-infra(64728a68-4675-4652-a800-7f055197862b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 09 13:24:16 crc kubenswrapper[4703]: > logger="UnhandledError" Mar 09 13:24:16 crc kubenswrapper[4703]: E0309 13:24:16.357375 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29551042-fv4ms" podUID="64728a68-4675-4652-a800-7f055197862b" Mar 09 13:24:16 crc kubenswrapper[4703]: E0309 13:24:16.473024 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29551042-fv4ms" podUID="64728a68-4675-4652-a800-7f055197862b" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.753172 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.753337 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.802831 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6"] Mar 09 13:24:17 crc kubenswrapper[4703]: E0309 13:24:17.803225 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d126787-db9b-4b15-92f7-26aa470f3d18" containerName="controller-manager" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.803254 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d126787-db9b-4b15-92f7-26aa470f3d18" containerName="controller-manager" Mar 09 13:24:17 crc kubenswrapper[4703]: E0309 13:24:17.803278 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0dcd249-cdd6-4b3c-865c-2a105bbcfb32" containerName="route-controller-manager" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.803289 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0dcd249-cdd6-4b3c-865c-2a105bbcfb32" containerName="route-controller-manager" Mar 09 13:24:17 crc kubenswrapper[4703]: E0309 13:24:17.803303 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0faae95e-5842-4bb3-84b9-46067585d63f" containerName="pruner" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.803316 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="0faae95e-5842-4bb3-84b9-46067585d63f" containerName="pruner" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.803462 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0dcd249-cdd6-4b3c-865c-2a105bbcfb32" containerName="route-controller-manager" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.803490 4703 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1d126787-db9b-4b15-92f7-26aa470f3d18" containerName="controller-manager" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.803505 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="0faae95e-5842-4bb3-84b9-46067585d63f" containerName="pruner" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.804206 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.819432 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6"] Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.855167 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-serving-cert\") pod \"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32\" (UID: \"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32\") " Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.855221 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-client-ca\") pod \"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32\" (UID: \"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32\") " Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.855273 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d126787-db9b-4b15-92f7-26aa470f3d18-proxy-ca-bundles\") pod \"1d126787-db9b-4b15-92f7-26aa470f3d18\" (UID: \"1d126787-db9b-4b15-92f7-26aa470f3d18\") " Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.855325 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1d126787-db9b-4b15-92f7-26aa470f3d18-serving-cert\") pod \"1d126787-db9b-4b15-92f7-26aa470f3d18\" (UID: \"1d126787-db9b-4b15-92f7-26aa470f3d18\") " Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.855362 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st6m4\" (UniqueName: \"kubernetes.io/projected/1d126787-db9b-4b15-92f7-26aa470f3d18-kube-api-access-st6m4\") pod \"1d126787-db9b-4b15-92f7-26aa470f3d18\" (UID: \"1d126787-db9b-4b15-92f7-26aa470f3d18\") " Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.855402 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d126787-db9b-4b15-92f7-26aa470f3d18-config\") pod \"1d126787-db9b-4b15-92f7-26aa470f3d18\" (UID: \"1d126787-db9b-4b15-92f7-26aa470f3d18\") " Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.855448 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-config\") pod \"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32\" (UID: \"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32\") " Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.855495 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d126787-db9b-4b15-92f7-26aa470f3d18-client-ca\") pod \"1d126787-db9b-4b15-92f7-26aa470f3d18\" (UID: \"1d126787-db9b-4b15-92f7-26aa470f3d18\") " Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.855523 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z8mq\" (UniqueName: \"kubernetes.io/projected/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-kube-api-access-7z8mq\") pod \"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32\" (UID: \"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32\") " Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.856367 4703 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d126787-db9b-4b15-92f7-26aa470f3d18-client-ca" (OuterVolumeSpecName: "client-ca") pod "1d126787-db9b-4b15-92f7-26aa470f3d18" (UID: "1d126787-db9b-4b15-92f7-26aa470f3d18"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.856398 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d126787-db9b-4b15-92f7-26aa470f3d18-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1d126787-db9b-4b15-92f7-26aa470f3d18" (UID: "1d126787-db9b-4b15-92f7-26aa470f3d18"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.856944 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-config" (OuterVolumeSpecName: "config") pod "e0dcd249-cdd6-4b3c-865c-2a105bbcfb32" (UID: "e0dcd249-cdd6-4b3c-865c-2a105bbcfb32"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.857071 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d126787-db9b-4b15-92f7-26aa470f3d18-config" (OuterVolumeSpecName: "config") pod "1d126787-db9b-4b15-92f7-26aa470f3d18" (UID: "1d126787-db9b-4b15-92f7-26aa470f3d18"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.857640 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-client-ca" (OuterVolumeSpecName: "client-ca") pod "e0dcd249-cdd6-4b3c-865c-2a105bbcfb32" (UID: "e0dcd249-cdd6-4b3c-865c-2a105bbcfb32"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.861195 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e0dcd249-cdd6-4b3c-865c-2a105bbcfb32" (UID: "e0dcd249-cdd6-4b3c-865c-2a105bbcfb32"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.861227 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d126787-db9b-4b15-92f7-26aa470f3d18-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1d126787-db9b-4b15-92f7-26aa470f3d18" (UID: "1d126787-db9b-4b15-92f7-26aa470f3d18"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.861327 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d126787-db9b-4b15-92f7-26aa470f3d18-kube-api-access-st6m4" (OuterVolumeSpecName: "kube-api-access-st6m4") pod "1d126787-db9b-4b15-92f7-26aa470f3d18" (UID: "1d126787-db9b-4b15-92f7-26aa470f3d18"). InnerVolumeSpecName "kube-api-access-st6m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.863484 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-kube-api-access-7z8mq" (OuterVolumeSpecName: "kube-api-access-7z8mq") pod "e0dcd249-cdd6-4b3c-865c-2a105bbcfb32" (UID: "e0dcd249-cdd6-4b3c-865c-2a105bbcfb32"). InnerVolumeSpecName "kube-api-access-7z8mq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.956464 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2kf2\" (UniqueName: \"kubernetes.io/projected/be244c0d-2805-4b7c-8c21-d895e2d037f0-kube-api-access-q2kf2\") pod \"route-controller-manager-6d778f4477-zz4q6\" (UID: \"be244c0d-2805-4b7c-8c21-d895e2d037f0\") " pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.956551 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be244c0d-2805-4b7c-8c21-d895e2d037f0-config\") pod \"route-controller-manager-6d778f4477-zz4q6\" (UID: \"be244c0d-2805-4b7c-8c21-d895e2d037f0\") " pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.956581 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be244c0d-2805-4b7c-8c21-d895e2d037f0-serving-cert\") pod \"route-controller-manager-6d778f4477-zz4q6\" (UID: \"be244c0d-2805-4b7c-8c21-d895e2d037f0\") " pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.956608 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be244c0d-2805-4b7c-8c21-d895e2d037f0-client-ca\") pod \"route-controller-manager-6d778f4477-zz4q6\" (UID: \"be244c0d-2805-4b7c-8c21-d895e2d037f0\") " pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.956682 4703 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1d126787-db9b-4b15-92f7-26aa470f3d18-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.956698 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.956711 4703 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d126787-db9b-4b15-92f7-26aa470f3d18-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.956724 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z8mq\" (UniqueName: \"kubernetes.io/projected/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-kube-api-access-7z8mq\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.956736 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.956748 4703 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.956760 4703 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1d126787-db9b-4b15-92f7-26aa470f3d18-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:17 crc kubenswrapper[4703]: I0309 13:24:17.956773 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d126787-db9b-4b15-92f7-26aa470f3d18-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:17 crc 
kubenswrapper[4703]: I0309 13:24:17.956785 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st6m4\" (UniqueName: \"kubernetes.io/projected/1d126787-db9b-4b15-92f7-26aa470f3d18-kube-api-access-st6m4\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:18 crc kubenswrapper[4703]: I0309 13:24:18.058271 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2kf2\" (UniqueName: \"kubernetes.io/projected/be244c0d-2805-4b7c-8c21-d895e2d037f0-kube-api-access-q2kf2\") pod \"route-controller-manager-6d778f4477-zz4q6\" (UID: \"be244c0d-2805-4b7c-8c21-d895e2d037f0\") " pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" Mar 09 13:24:18 crc kubenswrapper[4703]: I0309 13:24:18.058338 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be244c0d-2805-4b7c-8c21-d895e2d037f0-config\") pod \"route-controller-manager-6d778f4477-zz4q6\" (UID: \"be244c0d-2805-4b7c-8c21-d895e2d037f0\") " pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" Mar 09 13:24:18 crc kubenswrapper[4703]: I0309 13:24:18.058366 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be244c0d-2805-4b7c-8c21-d895e2d037f0-serving-cert\") pod \"route-controller-manager-6d778f4477-zz4q6\" (UID: \"be244c0d-2805-4b7c-8c21-d895e2d037f0\") " pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" Mar 09 13:24:18 crc kubenswrapper[4703]: I0309 13:24:18.058395 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be244c0d-2805-4b7c-8c21-d895e2d037f0-client-ca\") pod \"route-controller-manager-6d778f4477-zz4q6\" (UID: \"be244c0d-2805-4b7c-8c21-d895e2d037f0\") " 
pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" Mar 09 13:24:18 crc kubenswrapper[4703]: I0309 13:24:18.059447 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be244c0d-2805-4b7c-8c21-d895e2d037f0-client-ca\") pod \"route-controller-manager-6d778f4477-zz4q6\" (UID: \"be244c0d-2805-4b7c-8c21-d895e2d037f0\") " pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" Mar 09 13:24:18 crc kubenswrapper[4703]: I0309 13:24:18.059712 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be244c0d-2805-4b7c-8c21-d895e2d037f0-config\") pod \"route-controller-manager-6d778f4477-zz4q6\" (UID: \"be244c0d-2805-4b7c-8c21-d895e2d037f0\") " pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" Mar 09 13:24:18 crc kubenswrapper[4703]: I0309 13:24:18.063141 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be244c0d-2805-4b7c-8c21-d895e2d037f0-serving-cert\") pod \"route-controller-manager-6d778f4477-zz4q6\" (UID: \"be244c0d-2805-4b7c-8c21-d895e2d037f0\") " pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" Mar 09 13:24:18 crc kubenswrapper[4703]: I0309 13:24:18.086715 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2kf2\" (UniqueName: \"kubernetes.io/projected/be244c0d-2805-4b7c-8c21-d895e2d037f0-kube-api-access-q2kf2\") pod \"route-controller-manager-6d778f4477-zz4q6\" (UID: \"be244c0d-2805-4b7c-8c21-d895e2d037f0\") " pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" Mar 09 13:24:18 crc kubenswrapper[4703]: I0309 13:24:18.131394 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" Mar 09 13:24:18 crc kubenswrapper[4703]: I0309 13:24:18.484273 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" event={"ID":"e0dcd249-cdd6-4b3c-865c-2a105bbcfb32","Type":"ContainerDied","Data":"f1058e9a3c339061f156d3668ce821e1daa0253f20b9550d7dc48eae36bafc2e"} Mar 09 13:24:18 crc kubenswrapper[4703]: I0309 13:24:18.484300 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5" Mar 09 13:24:18 crc kubenswrapper[4703]: I0309 13:24:18.484335 4703 scope.go:117] "RemoveContainer" containerID="cad6b47c96f1e6cfd1ea7d10f7acba50659d996a63dbff63c0eb08e129147695" Mar 09 13:24:18 crc kubenswrapper[4703]: I0309 13:24:18.486526 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" event={"ID":"1d126787-db9b-4b15-92f7-26aa470f3d18","Type":"ContainerDied","Data":"386c4b23696688b7dbc93b07d643c0dc0f0dc0e0ddb3914a7016ae824a4790ea"} Mar 09 13:24:18 crc kubenswrapper[4703]: I0309 13:24:18.486609 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk" Mar 09 13:24:18 crc kubenswrapper[4703]: I0309 13:24:18.520605 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk"] Mar 09 13:24:18 crc kubenswrapper[4703]: I0309 13:24:18.524034 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6cd86c7dd-dd9hk"] Mar 09 13:24:18 crc kubenswrapper[4703]: I0309 13:24:18.531317 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5"] Mar 09 13:24:18 crc kubenswrapper[4703]: I0309 13:24:18.533713 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77dd4cd8f8-25pv5"] Mar 09 13:24:18 crc kubenswrapper[4703]: I0309 13:24:18.713811 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d126787-db9b-4b15-92f7-26aa470f3d18" path="/var/lib/kubelet/pods/1d126787-db9b-4b15-92f7-26aa470f3d18/volumes" Mar 09 13:24:18 crc kubenswrapper[4703]: I0309 13:24:18.714471 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0dcd249-cdd6-4b3c-865c-2a105bbcfb32" path="/var/lib/kubelet/pods/e0dcd249-cdd6-4b3c-865c-2a105bbcfb32/volumes" Mar 09 13:24:20 crc kubenswrapper[4703]: I0309 13:24:20.850680 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jlgk5"] Mar 09 13:24:20 crc kubenswrapper[4703]: I0309 13:24:20.881392 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551044-rdp77"] Mar 09 13:24:22 crc kubenswrapper[4703]: E0309 13:24:22.398315 4703 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 09 13:24:22 crc kubenswrapper[4703]: E0309 13:24:22.398905 4703 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mwqvr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-9gzfj_openshift-marketplace(7c0fe86d-3006-44b3-811a-f28758ef07fd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 
13:24:22 crc kubenswrapper[4703]: E0309 13:24:22.400161 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-9gzfj" podUID="7c0fe86d-3006-44b3-811a-f28758ef07fd" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.524128 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-759d786675-njnpd"] Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.524915 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-759d786675-njnpd" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.526488 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.528120 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.528202 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.528281 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.528360 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.528539 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.535787 4703 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.541388 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-759d786675-njnpd"] Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.555880 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.620941 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98e2c5bf-bd56-48af-bf65-e095f4d3767e-serving-cert\") pod \"controller-manager-759d786675-njnpd\" (UID: \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\") " pod="openshift-controller-manager/controller-manager-759d786675-njnpd" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.621014 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98e2c5bf-bd56-48af-bf65-e095f4d3767e-config\") pod \"controller-manager-759d786675-njnpd\" (UID: \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\") " pod="openshift-controller-manager/controller-manager-759d786675-njnpd" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.621052 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/98e2c5bf-bd56-48af-bf65-e095f4d3767e-proxy-ca-bundles\") pod \"controller-manager-759d786675-njnpd\" (UID: \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\") " pod="openshift-controller-manager/controller-manager-759d786675-njnpd" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.621079 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/98e2c5bf-bd56-48af-bf65-e095f4d3767e-client-ca\") pod \"controller-manager-759d786675-njnpd\" (UID: \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\") " pod="openshift-controller-manager/controller-manager-759d786675-njnpd" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.621147 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nmx9\" (UniqueName: \"kubernetes.io/projected/98e2c5bf-bd56-48af-bf65-e095f4d3767e-kube-api-access-7nmx9\") pod \"controller-manager-759d786675-njnpd\" (UID: \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\") " pod="openshift-controller-manager/controller-manager-759d786675-njnpd" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.722709 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nmx9\" (UniqueName: \"kubernetes.io/projected/98e2c5bf-bd56-48af-bf65-e095f4d3767e-kube-api-access-7nmx9\") pod \"controller-manager-759d786675-njnpd\" (UID: \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\") " pod="openshift-controller-manager/controller-manager-759d786675-njnpd" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.722792 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98e2c5bf-bd56-48af-bf65-e095f4d3767e-serving-cert\") pod \"controller-manager-759d786675-njnpd\" (UID: \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\") " pod="openshift-controller-manager/controller-manager-759d786675-njnpd" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.722827 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98e2c5bf-bd56-48af-bf65-e095f4d3767e-config\") pod \"controller-manager-759d786675-njnpd\" (UID: \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\") " pod="openshift-controller-manager/controller-manager-759d786675-njnpd" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 
13:24:22.722860 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/98e2c5bf-bd56-48af-bf65-e095f4d3767e-proxy-ca-bundles\") pod \"controller-manager-759d786675-njnpd\" (UID: \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\") " pod="openshift-controller-manager/controller-manager-759d786675-njnpd" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.722880 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98e2c5bf-bd56-48af-bf65-e095f4d3767e-client-ca\") pod \"controller-manager-759d786675-njnpd\" (UID: \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\") " pod="openshift-controller-manager/controller-manager-759d786675-njnpd" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.723631 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98e2c5bf-bd56-48af-bf65-e095f4d3767e-client-ca\") pod \"controller-manager-759d786675-njnpd\" (UID: \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\") " pod="openshift-controller-manager/controller-manager-759d786675-njnpd" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.725215 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/98e2c5bf-bd56-48af-bf65-e095f4d3767e-proxy-ca-bundles\") pod \"controller-manager-759d786675-njnpd\" (UID: \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\") " pod="openshift-controller-manager/controller-manager-759d786675-njnpd" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.725793 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98e2c5bf-bd56-48af-bf65-e095f4d3767e-config\") pod \"controller-manager-759d786675-njnpd\" (UID: \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\") " 
pod="openshift-controller-manager/controller-manager-759d786675-njnpd" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.739648 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98e2c5bf-bd56-48af-bf65-e095f4d3767e-serving-cert\") pod \"controller-manager-759d786675-njnpd\" (UID: \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\") " pod="openshift-controller-manager/controller-manager-759d786675-njnpd" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.741013 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nmx9\" (UniqueName: \"kubernetes.io/projected/98e2c5bf-bd56-48af-bf65-e095f4d3767e-kube-api-access-7nmx9\") pod \"controller-manager-759d786675-njnpd\" (UID: \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\") " pod="openshift-controller-manager/controller-manager-759d786675-njnpd" Mar 09 13:24:22 crc kubenswrapper[4703]: I0309 13:24:22.848952 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-759d786675-njnpd" Mar 09 13:24:23 crc kubenswrapper[4703]: I0309 13:24:23.057548 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knp8m" Mar 09 13:24:24 crc kubenswrapper[4703]: E0309 13:24:24.236122 4703 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 09 13:24:24 crc kubenswrapper[4703]: E0309 13:24:24.236401 4703 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fskpx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:ni
l,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-zc87r_openshift-marketplace(0ab69d10-042e-422a-830e-65d3d1132197): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 13:24:24 crc kubenswrapper[4703]: E0309 13:24:24.237631 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-zc87r" podUID="0ab69d10-042e-422a-830e-65d3d1132197" Mar 09 13:24:25 crc kubenswrapper[4703]: E0309 13:24:25.583656 4703 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 09 13:24:25 crc kubenswrapper[4703]: E0309 13:24:25.584179 4703 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rnv65,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-j5rmg_openshift-marketplace(a7eceeab-0758-4f0f-88c7-5b744fbab868): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 13:24:25 crc kubenswrapper[4703]: E0309 13:24:25.585698 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-j5rmg" podUID="a7eceeab-0758-4f0f-88c7-5b744fbab868" Mar 09 13:24:25 crc 
kubenswrapper[4703]: E0309 13:24:25.622330 4703 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 09 13:24:25 crc kubenswrapper[4703]: E0309 13:24:25.622478 4703 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6gttc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-9tq2q_openshift-marketplace(adee29a3-c626-4dc3-8323-ce1e852faea3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 13:24:25 crc kubenswrapper[4703]: E0309 13:24:25.623683 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-9tq2q" podUID="adee29a3-c626-4dc3-8323-ce1e852faea3" Mar 09 13:24:25 crc kubenswrapper[4703]: I0309 13:24:25.657827 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-759d786675-njnpd"] Mar 09 13:24:25 crc kubenswrapper[4703]: I0309 13:24:25.756757 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6"] Mar 09 13:24:27 crc kubenswrapper[4703]: I0309 13:24:27.499254 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 09 13:24:27 crc kubenswrapper[4703]: I0309 13:24:27.500312 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:24:27 crc kubenswrapper[4703]: I0309 13:24:27.503689 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 09 13:24:27 crc kubenswrapper[4703]: I0309 13:24:27.503999 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 09 13:24:27 crc kubenswrapper[4703]: I0309 13:24:27.510869 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 09 13:24:27 crc kubenswrapper[4703]: I0309 13:24:27.597534 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2008051d-eda2-484e-815f-d8bd152eaa6f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2008051d-eda2-484e-815f-d8bd152eaa6f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:24:27 crc kubenswrapper[4703]: I0309 13:24:27.597743 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2008051d-eda2-484e-815f-d8bd152eaa6f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2008051d-eda2-484e-815f-d8bd152eaa6f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:24:27 crc kubenswrapper[4703]: I0309 13:24:27.699198 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2008051d-eda2-484e-815f-d8bd152eaa6f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2008051d-eda2-484e-815f-d8bd152eaa6f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:24:27 crc kubenswrapper[4703]: I0309 13:24:27.699273 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/2008051d-eda2-484e-815f-d8bd152eaa6f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2008051d-eda2-484e-815f-d8bd152eaa6f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:24:27 crc kubenswrapper[4703]: I0309 13:24:27.699324 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2008051d-eda2-484e-815f-d8bd152eaa6f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2008051d-eda2-484e-815f-d8bd152eaa6f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:24:27 crc kubenswrapper[4703]: I0309 13:24:27.717556 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2008051d-eda2-484e-815f-d8bd152eaa6f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2008051d-eda2-484e-815f-d8bd152eaa6f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:24:27 crc kubenswrapper[4703]: I0309 13:24:27.825253 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:24:30 crc kubenswrapper[4703]: I0309 13:24:30.269175 4703 scope.go:117] "RemoveContainer" containerID="8068c54332f85d11137a7af42b3e90bf127d53048424cc8186cde8ed075dcd04" Mar 09 13:24:30 crc kubenswrapper[4703]: W0309 13:24:30.273526 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod967e7a44_ac71_42aa_9847_37799ff35cc0.slice/crio-cf9a5f3206209f8b846f31f1ee7916ecbfad0dda196eff4dad0446717fec357d WatchSource:0}: Error finding container cf9a5f3206209f8b846f31f1ee7916ecbfad0dda196eff4dad0446717fec357d: Status 404 returned error can't find the container with id cf9a5f3206209f8b846f31f1ee7916ecbfad0dda196eff4dad0446717fec357d Mar 09 13:24:30 crc kubenswrapper[4703]: E0309 13:24:30.275270 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-9gzfj" podUID="7c0fe86d-3006-44b3-811a-f28758ef07fd" Mar 09 13:24:30 crc kubenswrapper[4703]: E0309 13:24:30.275880 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-zc87r" podUID="0ab69d10-042e-422a-830e-65d3d1132197" Mar 09 13:24:30 crc kubenswrapper[4703]: E0309 13:24:30.277349 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-j5rmg" podUID="a7eceeab-0758-4f0f-88c7-5b744fbab868" Mar 09 13:24:30 crc 
kubenswrapper[4703]: E0309 13:24:30.277405 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9tq2q" podUID="adee29a3-c626-4dc3-8323-ce1e852faea3" Mar 09 13:24:30 crc kubenswrapper[4703]: E0309 13:24:30.443815 4703 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 09 13:24:30 crc kubenswrapper[4703]: E0309 13:24:30.444095 4703 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sgqhz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-m75f5_openshift-marketplace(75d9c9d6-9701-4627-a87b-b2ae669a0eae): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 13:24:30 crc kubenswrapper[4703]: E0309 13:24:30.445429 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-m75f5" podUID="75d9c9d6-9701-4627-a87b-b2ae669a0eae" Mar 09 13:24:30 crc 
kubenswrapper[4703]: E0309 13:24:30.528820 4703 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 09 13:24:30 crc kubenswrapper[4703]: E0309 13:24:30.528974 4703 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-glzwf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-gkg4j_openshift-marketplace(09a58d8e-1a70-4e6c-b217-1c2b2537accc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 13:24:30 crc kubenswrapper[4703]: E0309 13:24:30.530080 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-gkg4j" podUID="09a58d8e-1a70-4e6c-b217-1c2b2537accc" Mar 09 13:24:30 crc kubenswrapper[4703]: I0309 13:24:30.543317 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551044-rdp77" event={"ID":"ba5e15b8-80e8-4d9c-83eb-12dd004ca901","Type":"ContainerStarted","Data":"0af3864c2cde18e1ed3b5035508cf3132264890774db509235e40f6d605d4f3a"} Mar 09 13:24:30 crc kubenswrapper[4703]: I0309 13:24:30.544296 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jlgk5" event={"ID":"967e7a44-ac71-42aa-9847-37799ff35cc0","Type":"ContainerStarted","Data":"cf9a5f3206209f8b846f31f1ee7916ecbfad0dda196eff4dad0446717fec357d"} Mar 09 13:24:30 crc kubenswrapper[4703]: E0309 13:24:30.548559 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-gkg4j" podUID="09a58d8e-1a70-4e6c-b217-1c2b2537accc" Mar 09 13:24:30 crc kubenswrapper[4703]: E0309 13:24:30.551753 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-operators-m75f5" podUID="75d9c9d6-9701-4627-a87b-b2ae669a0eae" Mar 09 13:24:30 crc kubenswrapper[4703]: I0309 13:24:30.574918 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-759d786675-njnpd"] Mar 09 13:24:30 crc kubenswrapper[4703]: W0309 13:24:30.583039 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98e2c5bf_bd56_48af_bf65_e095f4d3767e.slice/crio-9edbf2988ba692104508d60a33ce2c2f0582882185ac0a353cbfdc94e6e85e01 WatchSource:0}: Error finding container 9edbf2988ba692104508d60a33ce2c2f0582882185ac0a353cbfdc94e6e85e01: Status 404 returned error can't find the container with id 9edbf2988ba692104508d60a33ce2c2f0582882185ac0a353cbfdc94e6e85e01 Mar 09 13:24:30 crc kubenswrapper[4703]: I0309 13:24:30.696284 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 09 13:24:30 crc kubenswrapper[4703]: I0309 13:24:30.701126 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6"] Mar 09 13:24:30 crc kubenswrapper[4703]: W0309 13:24:30.762975 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2008051d_eda2_484e_815f_d8bd152eaa6f.slice/crio-ed46b89519cdc573fd723af26e904c199b55a42f43fe2fc88076c6f65ef36c6b WatchSource:0}: Error finding container ed46b89519cdc573fd723af26e904c199b55a42f43fe2fc88076c6f65ef36c6b: Status 404 returned error can't find the container with id ed46b89519cdc573fd723af26e904c199b55a42f43fe2fc88076c6f65ef36c6b Mar 09 13:24:30 crc kubenswrapper[4703]: W0309 13:24:30.765746 4703 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe244c0d_2805_4b7c_8c21_d895e2d037f0.slice/crio-0f34f11990115b645cf14480c79c23366d7422ed40a244171b5502463bfa89ae WatchSource:0}: Error finding container 0f34f11990115b645cf14480c79c23366d7422ed40a244171b5502463bfa89ae: Status 404 returned error can't find the container with id 0f34f11990115b645cf14480c79c23366d7422ed40a244171b5502463bfa89ae Mar 09 13:24:31 crc kubenswrapper[4703]: I0309 13:24:31.554498 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" event={"ID":"be244c0d-2805-4b7c-8c21-d895e2d037f0","Type":"ContainerStarted","Data":"0f34f11990115b645cf14480c79c23366d7422ed40a244171b5502463bfa89ae"} Mar 09 13:24:31 crc kubenswrapper[4703]: I0309 13:24:31.555634 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2008051d-eda2-484e-815f-d8bd152eaa6f","Type":"ContainerStarted","Data":"ed46b89519cdc573fd723af26e904c199b55a42f43fe2fc88076c6f65ef36c6b"} Mar 09 13:24:31 crc kubenswrapper[4703]: I0309 13:24:31.556718 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-759d786675-njnpd" event={"ID":"98e2c5bf-bd56-48af-bf65-e095f4d3767e","Type":"ContainerStarted","Data":"9edbf2988ba692104508d60a33ce2c2f0582882185ac0a353cbfdc94e6e85e01"} Mar 09 13:24:31 crc kubenswrapper[4703]: I0309 13:24:31.558183 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jlgk5" event={"ID":"967e7a44-ac71-42aa-9847-37799ff35cc0","Type":"ContainerStarted","Data":"cc5f3c779aef31920f902971b26757ade172974052b2fc892496af49f2c1c243"} Mar 09 13:24:32 crc kubenswrapper[4703]: E0309 13:24:32.147675 4703 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 09 13:24:32 crc kubenswrapper[4703]: E0309 13:24:32.147814 4703 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fg75f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-6pqzw_openshift-marketplace(3916d955-3b83-488d-959c-88b09424a3e3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 
13:24:32 crc kubenswrapper[4703]: E0309 13:24:32.149020 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6pqzw" podUID="3916d955-3b83-488d-959c-88b09424a3e3" Mar 09 13:24:32 crc kubenswrapper[4703]: E0309 13:24:32.523194 4703 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 09 13:24:32 crc kubenswrapper[4703]: E0309 13:24:32.523698 4703 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bf8nd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-kvftn_openshift-marketplace(174b5189-1afe-40a3-813b-052dd29ad296): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 13:24:32 crc kubenswrapper[4703]: E0309 13:24:32.524883 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-kvftn" podUID="174b5189-1afe-40a3-813b-052dd29ad296" Mar 09 13:24:32 crc 
kubenswrapper[4703]: I0309 13:24:32.563975 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-759d786675-njnpd" event={"ID":"98e2c5bf-bd56-48af-bf65-e095f4d3767e","Type":"ContainerStarted","Data":"4c116fbeae6f2d382f911529db5d2165ed8f033b99fdcd41404833de1b71a27c"} Mar 09 13:24:32 crc kubenswrapper[4703]: I0309 13:24:32.565090 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-759d786675-njnpd" podUID="98e2c5bf-bd56-48af-bf65-e095f4d3767e" containerName="controller-manager" containerID="cri-o://4c116fbeae6f2d382f911529db5d2165ed8f033b99fdcd41404833de1b71a27c" gracePeriod=30 Mar 09 13:24:32 crc kubenswrapper[4703]: I0309 13:24:32.565175 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-759d786675-njnpd" Mar 09 13:24:32 crc kubenswrapper[4703]: I0309 13:24:32.567273 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jlgk5" event={"ID":"967e7a44-ac71-42aa-9847-37799ff35cc0","Type":"ContainerStarted","Data":"846ae745bb2f99390b2a9e5e7e52b3159358f09594443e0821ead224b825d53f"} Mar 09 13:24:32 crc kubenswrapper[4703]: I0309 13:24:32.571933 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" event={"ID":"be244c0d-2805-4b7c-8c21-d895e2d037f0","Type":"ContainerStarted","Data":"7714c18e7e3419d42ba3f9fd09a2d5894f63dbf3d37e0bb14eec607e00f4969b"} Mar 09 13:24:32 crc kubenswrapper[4703]: I0309 13:24:32.572041 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" podUID="be244c0d-2805-4b7c-8c21-d895e2d037f0" containerName="route-controller-manager" containerID="cri-o://7714c18e7e3419d42ba3f9fd09a2d5894f63dbf3d37e0bb14eec607e00f4969b" 
gracePeriod=30 Mar 09 13:24:32 crc kubenswrapper[4703]: I0309 13:24:32.572392 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" Mar 09 13:24:32 crc kubenswrapper[4703]: I0309 13:24:32.575307 4703 generic.go:334] "Generic (PLEG): container finished" podID="2008051d-eda2-484e-815f-d8bd152eaa6f" containerID="5aab5768122cfd85db9ec1419697794ebb6efea4faae071e79c5b6c6374dea95" exitCode=0 Mar 09 13:24:32 crc kubenswrapper[4703]: I0309 13:24:32.575393 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2008051d-eda2-484e-815f-d8bd152eaa6f","Type":"ContainerDied","Data":"5aab5768122cfd85db9ec1419697794ebb6efea4faae071e79c5b6c6374dea95"} Mar 09 13:24:32 crc kubenswrapper[4703]: E0309 13:24:32.578222 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6pqzw" podUID="3916d955-3b83-488d-959c-88b09424a3e3" Mar 09 13:24:32 crc kubenswrapper[4703]: E0309 13:24:32.578401 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-kvftn" podUID="174b5189-1afe-40a3-813b-052dd29ad296" Mar 09 13:24:32 crc kubenswrapper[4703]: I0309 13:24:32.578717 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-759d786675-njnpd" Mar 09 13:24:32 crc kubenswrapper[4703]: I0309 13:24:32.585196 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" Mar 09 13:24:32 crc kubenswrapper[4703]: I0309 13:24:32.590620 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-759d786675-njnpd" podStartSLOduration=27.590604488 podStartE2EDuration="27.590604488s" podCreationTimestamp="2026-03-09 13:24:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:32.589812915 +0000 UTC m=+268.557228601" watchObservedRunningTime="2026-03-09 13:24:32.590604488 +0000 UTC m=+268.558020174" Mar 09 13:24:32 crc kubenswrapper[4703]: I0309 13:24:32.639948 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jlgk5" podStartSLOduration=216.639923073 podStartE2EDuration="3m36.639923073s" podCreationTimestamp="2026-03-09 13:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:32.616364258 +0000 UTC m=+268.583779954" watchObservedRunningTime="2026-03-09 13:24:32.639923073 +0000 UTC m=+268.607338749" Mar 09 13:24:32 crc kubenswrapper[4703]: I0309 13:24:32.656736 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" podStartSLOduration=27.656717228 podStartE2EDuration="27.656717228s" podCreationTimestamp="2026-03-09 13:24:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:32.653793762 +0000 UTC m=+268.621209448" watchObservedRunningTime="2026-03-09 13:24:32.656717228 +0000 UTC m=+268.624132914" Mar 09 13:24:32 crc kubenswrapper[4703]: I0309 13:24:32.951196 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" Mar 09 13:24:32 crc kubenswrapper[4703]: I0309 13:24:32.974675 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7"] Mar 09 13:24:32 crc kubenswrapper[4703]: E0309 13:24:32.974894 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be244c0d-2805-4b7c-8c21-d895e2d037f0" containerName="route-controller-manager" Mar 09 13:24:32 crc kubenswrapper[4703]: I0309 13:24:32.974907 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="be244c0d-2805-4b7c-8c21-d895e2d037f0" containerName="route-controller-manager" Mar 09 13:24:32 crc kubenswrapper[4703]: I0309 13:24:32.975014 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="be244c0d-2805-4b7c-8c21-d895e2d037f0" containerName="route-controller-manager" Mar 09 13:24:32 crc kubenswrapper[4703]: I0309 13:24:32.975385 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.002230 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7"] Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.066532 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-759d786675-njnpd" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.075817 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2kf2\" (UniqueName: \"kubernetes.io/projected/be244c0d-2805-4b7c-8c21-d895e2d037f0-kube-api-access-q2kf2\") pod \"be244c0d-2805-4b7c-8c21-d895e2d037f0\" (UID: \"be244c0d-2805-4b7c-8c21-d895e2d037f0\") " Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.075871 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be244c0d-2805-4b7c-8c21-d895e2d037f0-client-ca\") pod \"be244c0d-2805-4b7c-8c21-d895e2d037f0\" (UID: \"be244c0d-2805-4b7c-8c21-d895e2d037f0\") " Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.075893 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be244c0d-2805-4b7c-8c21-d895e2d037f0-serving-cert\") pod \"be244c0d-2805-4b7c-8c21-d895e2d037f0\" (UID: \"be244c0d-2805-4b7c-8c21-d895e2d037f0\") " Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.075926 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be244c0d-2805-4b7c-8c21-d895e2d037f0-config\") pod \"be244c0d-2805-4b7c-8c21-d895e2d037f0\" (UID: \"be244c0d-2805-4b7c-8c21-d895e2d037f0\") " Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.076157 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-config\") pod \"route-controller-manager-bf5d85c49-7fgv7\" (UID: \"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641\") " pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.076205 
4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-serving-cert\") pod \"route-controller-manager-bf5d85c49-7fgv7\" (UID: \"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641\") " pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.076233 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-client-ca\") pod \"route-controller-manager-bf5d85c49-7fgv7\" (UID: \"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641\") " pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.076264 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hph4v\" (UniqueName: \"kubernetes.io/projected/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-kube-api-access-hph4v\") pod \"route-controller-manager-bf5d85c49-7fgv7\" (UID: \"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641\") " pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.076820 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be244c0d-2805-4b7c-8c21-d895e2d037f0-client-ca" (OuterVolumeSpecName: "client-ca") pod "be244c0d-2805-4b7c-8c21-d895e2d037f0" (UID: "be244c0d-2805-4b7c-8c21-d895e2d037f0"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.076917 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be244c0d-2805-4b7c-8c21-d895e2d037f0-config" (OuterVolumeSpecName: "config") pod "be244c0d-2805-4b7c-8c21-d895e2d037f0" (UID: "be244c0d-2805-4b7c-8c21-d895e2d037f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.085309 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be244c0d-2805-4b7c-8c21-d895e2d037f0-kube-api-access-q2kf2" (OuterVolumeSpecName: "kube-api-access-q2kf2") pod "be244c0d-2805-4b7c-8c21-d895e2d037f0" (UID: "be244c0d-2805-4b7c-8c21-d895e2d037f0"). InnerVolumeSpecName "kube-api-access-q2kf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.088305 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be244c0d-2805-4b7c-8c21-d895e2d037f0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "be244c0d-2805-4b7c-8c21-d895e2d037f0" (UID: "be244c0d-2805-4b7c-8c21-d895e2d037f0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.177116 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98e2c5bf-bd56-48af-bf65-e095f4d3767e-client-ca\") pod \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\" (UID: \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\") " Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.177184 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98e2c5bf-bd56-48af-bf65-e095f4d3767e-serving-cert\") pod \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\" (UID: \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\") " Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.177214 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98e2c5bf-bd56-48af-bf65-e095f4d3767e-config\") pod \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\" (UID: \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\") " Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.177237 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nmx9\" (UniqueName: \"kubernetes.io/projected/98e2c5bf-bd56-48af-bf65-e095f4d3767e-kube-api-access-7nmx9\") pod \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\" (UID: \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\") " Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.177258 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/98e2c5bf-bd56-48af-bf65-e095f4d3767e-proxy-ca-bundles\") pod \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\" (UID: \"98e2c5bf-bd56-48af-bf65-e095f4d3767e\") " Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.177455 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-serving-cert\") pod \"route-controller-manager-bf5d85c49-7fgv7\" (UID: \"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641\") " pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.177485 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-client-ca\") pod \"route-controller-manager-bf5d85c49-7fgv7\" (UID: \"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641\") " pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.177514 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hph4v\" (UniqueName: \"kubernetes.io/projected/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-kube-api-access-hph4v\") pod \"route-controller-manager-bf5d85c49-7fgv7\" (UID: \"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641\") " pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.177552 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-config\") pod \"route-controller-manager-bf5d85c49-7fgv7\" (UID: \"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641\") " pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.177585 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2kf2\" (UniqueName: \"kubernetes.io/projected/be244c0d-2805-4b7c-8c21-d895e2d037f0-kube-api-access-q2kf2\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.177595 4703 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/be244c0d-2805-4b7c-8c21-d895e2d037f0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.177605 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be244c0d-2805-4b7c-8c21-d895e2d037f0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.177614 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be244c0d-2805-4b7c-8c21-d895e2d037f0-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.178023 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98e2c5bf-bd56-48af-bf65-e095f4d3767e-client-ca" (OuterVolumeSpecName: "client-ca") pod "98e2c5bf-bd56-48af-bf65-e095f4d3767e" (UID: "98e2c5bf-bd56-48af-bf65-e095f4d3767e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.178409 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98e2c5bf-bd56-48af-bf65-e095f4d3767e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "98e2c5bf-bd56-48af-bf65-e095f4d3767e" (UID: "98e2c5bf-bd56-48af-bf65-e095f4d3767e"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.178982 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-client-ca\") pod \"route-controller-manager-bf5d85c49-7fgv7\" (UID: \"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641\") " pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.179036 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98e2c5bf-bd56-48af-bf65-e095f4d3767e-config" (OuterVolumeSpecName: "config") pod "98e2c5bf-bd56-48af-bf65-e095f4d3767e" (UID: "98e2c5bf-bd56-48af-bf65-e095f4d3767e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.179557 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-config\") pod \"route-controller-manager-bf5d85c49-7fgv7\" (UID: \"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641\") " pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.181289 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98e2c5bf-bd56-48af-bf65-e095f4d3767e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "98e2c5bf-bd56-48af-bf65-e095f4d3767e" (UID: "98e2c5bf-bd56-48af-bf65-e095f4d3767e"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.181894 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-serving-cert\") pod \"route-controller-manager-bf5d85c49-7fgv7\" (UID: \"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641\") " pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.181983 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98e2c5bf-bd56-48af-bf65-e095f4d3767e-kube-api-access-7nmx9" (OuterVolumeSpecName: "kube-api-access-7nmx9") pod "98e2c5bf-bd56-48af-bf65-e095f4d3767e" (UID: "98e2c5bf-bd56-48af-bf65-e095f4d3767e"). InnerVolumeSpecName "kube-api-access-7nmx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.197134 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hph4v\" (UniqueName: \"kubernetes.io/projected/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-kube-api-access-hph4v\") pod \"route-controller-manager-bf5d85c49-7fgv7\" (UID: \"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641\") " pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.279268 4703 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98e2c5bf-bd56-48af-bf65-e095f4d3767e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.279542 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98e2c5bf-bd56-48af-bf65-e095f4d3767e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.279555 4703 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98e2c5bf-bd56-48af-bf65-e095f4d3767e-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.279568 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nmx9\" (UniqueName: \"kubernetes.io/projected/98e2c5bf-bd56-48af-bf65-e095f4d3767e-kube-api-access-7nmx9\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.279581 4703 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/98e2c5bf-bd56-48af-bf65-e095f4d3767e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.308227 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.558760 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7"] Mar 09 13:24:33 crc kubenswrapper[4703]: W0309 13:24:33.568822 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d2d7f1e_d8ba_44ad_a7bd_3d3e00ab7641.slice/crio-6dc05efe4dd9812cad8ade7f8719e48cb68d457268ee76efc0dcfb67db81c71e WatchSource:0}: Error finding container 6dc05efe4dd9812cad8ade7f8719e48cb68d457268ee76efc0dcfb67db81c71e: Status 404 returned error can't find the container with id 6dc05efe4dd9812cad8ade7f8719e48cb68d457268ee76efc0dcfb67db81c71e Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.582068 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551042-fv4ms" event={"ID":"64728a68-4675-4652-a800-7f055197862b","Type":"ContainerStarted","Data":"150e49a38897494fd79421901164b2fd14df8a16c15fa522cbe3c4c5208499c3"} 
Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.584415 4703 generic.go:334] "Generic (PLEG): container finished" podID="be244c0d-2805-4b7c-8c21-d895e2d037f0" containerID="7714c18e7e3419d42ba3f9fd09a2d5894f63dbf3d37e0bb14eec607e00f4969b" exitCode=0 Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.584483 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" event={"ID":"be244c0d-2805-4b7c-8c21-d895e2d037f0","Type":"ContainerDied","Data":"7714c18e7e3419d42ba3f9fd09a2d5894f63dbf3d37e0bb14eec607e00f4969b"} Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.584510 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" event={"ID":"be244c0d-2805-4b7c-8c21-d895e2d037f0","Type":"ContainerDied","Data":"0f34f11990115b645cf14480c79c23366d7422ed40a244171b5502463bfa89ae"} Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.584532 4703 scope.go:117] "RemoveContainer" containerID="7714c18e7e3419d42ba3f9fd09a2d5894f63dbf3d37e0bb14eec607e00f4969b" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.584569 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.586173 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" event={"ID":"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641","Type":"ContainerStarted","Data":"6dc05efe4dd9812cad8ade7f8719e48cb68d457268ee76efc0dcfb67db81c71e"} Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.590774 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551044-rdp77" event={"ID":"ba5e15b8-80e8-4d9c-83eb-12dd004ca901","Type":"ContainerStarted","Data":"aba2307c3a2d038f289dc933c9a06d74f0d61651d04f08d6bdf587dc8d0dd961"} Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.595983 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551042-fv4ms" podStartSLOduration=105.221833464 podStartE2EDuration="2m33.595965382s" podCreationTimestamp="2026-03-09 13:22:00 +0000 UTC" firstStartedPulling="2026-03-09 13:23:44.885693783 +0000 UTC m=+220.853109469" lastFinishedPulling="2026-03-09 13:24:33.259825701 +0000 UTC m=+269.227241387" observedRunningTime="2026-03-09 13:24:33.594021175 +0000 UTC m=+269.561436861" watchObservedRunningTime="2026-03-09 13:24:33.595965382 +0000 UTC m=+269.563381068" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.601393 4703 generic.go:334] "Generic (PLEG): container finished" podID="98e2c5bf-bd56-48af-bf65-e095f4d3767e" containerID="4c116fbeae6f2d382f911529db5d2165ed8f033b99fdcd41404833de1b71a27c" exitCode=0 Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.601661 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-759d786675-njnpd" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.604068 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-759d786675-njnpd" event={"ID":"98e2c5bf-bd56-48af-bf65-e095f4d3767e","Type":"ContainerDied","Data":"4c116fbeae6f2d382f911529db5d2165ed8f033b99fdcd41404833de1b71a27c"} Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.605916 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-759d786675-njnpd" event={"ID":"98e2c5bf-bd56-48af-bf65-e095f4d3767e","Type":"ContainerDied","Data":"9edbf2988ba692104508d60a33ce2c2f0582882185ac0a353cbfdc94e6e85e01"} Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.624029 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551044-rdp77" podStartSLOduration=30.862204034 podStartE2EDuration="33.624008689s" podCreationTimestamp="2026-03-09 13:24:00 +0000 UTC" firstStartedPulling="2026-03-09 13:24:30.277458422 +0000 UTC m=+266.244874108" lastFinishedPulling="2026-03-09 13:24:33.039263077 +0000 UTC m=+269.006678763" observedRunningTime="2026-03-09 13:24:33.615282432 +0000 UTC m=+269.582698118" watchObservedRunningTime="2026-03-09 13:24:33.624008689 +0000 UTC m=+269.591424375" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.629621 4703 scope.go:117] "RemoveContainer" containerID="7714c18e7e3419d42ba3f9fd09a2d5894f63dbf3d37e0bb14eec607e00f4969b" Mar 09 13:24:33 crc kubenswrapper[4703]: E0309 13:24:33.630046 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7714c18e7e3419d42ba3f9fd09a2d5894f63dbf3d37e0bb14eec607e00f4969b\": container with ID starting with 7714c18e7e3419d42ba3f9fd09a2d5894f63dbf3d37e0bb14eec607e00f4969b not found: ID does not exist" 
containerID="7714c18e7e3419d42ba3f9fd09a2d5894f63dbf3d37e0bb14eec607e00f4969b" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.630094 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7714c18e7e3419d42ba3f9fd09a2d5894f63dbf3d37e0bb14eec607e00f4969b"} err="failed to get container status \"7714c18e7e3419d42ba3f9fd09a2d5894f63dbf3d37e0bb14eec607e00f4969b\": rpc error: code = NotFound desc = could not find container \"7714c18e7e3419d42ba3f9fd09a2d5894f63dbf3d37e0bb14eec607e00f4969b\": container with ID starting with 7714c18e7e3419d42ba3f9fd09a2d5894f63dbf3d37e0bb14eec607e00f4969b not found: ID does not exist" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.630124 4703 scope.go:117] "RemoveContainer" containerID="4c116fbeae6f2d382f911529db5d2165ed8f033b99fdcd41404833de1b71a27c" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.635618 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6"] Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.640996 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d778f4477-zz4q6"] Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.651873 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-759d786675-njnpd"] Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.655555 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-759d786675-njnpd"] Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.655817 4703 scope.go:117] "RemoveContainer" containerID="4c116fbeae6f2d382f911529db5d2165ed8f033b99fdcd41404833de1b71a27c" Mar 09 13:24:33 crc kubenswrapper[4703]: E0309 13:24:33.656436 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"4c116fbeae6f2d382f911529db5d2165ed8f033b99fdcd41404833de1b71a27c\": container with ID starting with 4c116fbeae6f2d382f911529db5d2165ed8f033b99fdcd41404833de1b71a27c not found: ID does not exist" containerID="4c116fbeae6f2d382f911529db5d2165ed8f033b99fdcd41404833de1b71a27c" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.656486 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c116fbeae6f2d382f911529db5d2165ed8f033b99fdcd41404833de1b71a27c"} err="failed to get container status \"4c116fbeae6f2d382f911529db5d2165ed8f033b99fdcd41404833de1b71a27c\": rpc error: code = NotFound desc = could not find container \"4c116fbeae6f2d382f911529db5d2165ed8f033b99fdcd41404833de1b71a27c\": container with ID starting with 4c116fbeae6f2d382f911529db5d2165ed8f033b99fdcd41404833de1b71a27c not found: ID does not exist" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.692740 4703 csr.go:261] certificate signing request csr-gsc69 is approved, waiting to be issued Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.701078 4703 csr.go:257] certificate signing request csr-gsc69 is issued Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.706916 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 09 13:24:33 crc kubenswrapper[4703]: E0309 13:24:33.707202 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e2c5bf-bd56-48af-bf65-e095f4d3767e" containerName="controller-manager" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.707220 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e2c5bf-bd56-48af-bf65-e095f4d3767e" containerName="controller-manager" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.707319 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="98e2c5bf-bd56-48af-bf65-e095f4d3767e" containerName="controller-manager" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.707661 4703 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.707745 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.785976 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f6bb052e-48ab-43a1-9e9a-c8109a014b1d-var-lock\") pod \"installer-9-crc\" (UID: \"f6bb052e-48ab-43a1-9e9a-c8109a014b1d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.786219 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6bb052e-48ab-43a1-9e9a-c8109a014b1d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f6bb052e-48ab-43a1-9e9a-c8109a014b1d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.786267 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6bb052e-48ab-43a1-9e9a-c8109a014b1d-kube-api-access\") pod \"installer-9-crc\" (UID: \"f6bb052e-48ab-43a1-9e9a-c8109a014b1d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.849717 4703 patch_prober.go:28] interesting pod/controller-manager-759d786675-njnpd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: i/o timeout" start-of-body= Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.849772 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-759d786675-njnpd" 
podUID="98e2c5bf-bd56-48af-bf65-e095f4d3767e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: i/o timeout" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.887921 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6bb052e-48ab-43a1-9e9a-c8109a014b1d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f6bb052e-48ab-43a1-9e9a-c8109a014b1d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.887997 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6bb052e-48ab-43a1-9e9a-c8109a014b1d-kube-api-access\") pod \"installer-9-crc\" (UID: \"f6bb052e-48ab-43a1-9e9a-c8109a014b1d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.888017 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6bb052e-48ab-43a1-9e9a-c8109a014b1d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f6bb052e-48ab-43a1-9e9a-c8109a014b1d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.888060 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f6bb052e-48ab-43a1-9e9a-c8109a014b1d-var-lock\") pod \"installer-9-crc\" (UID: \"f6bb052e-48ab-43a1-9e9a-c8109a014b1d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.888037 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f6bb052e-48ab-43a1-9e9a-c8109a014b1d-var-lock\") pod \"installer-9-crc\" (UID: \"f6bb052e-48ab-43a1-9e9a-c8109a014b1d\") " 
pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.910804 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6bb052e-48ab-43a1-9e9a-c8109a014b1d-kube-api-access\") pod \"installer-9-crc\" (UID: \"f6bb052e-48ab-43a1-9e9a-c8109a014b1d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.956961 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:24:33 crc kubenswrapper[4703]: I0309 13:24:33.959870 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.090424 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2008051d-eda2-484e-815f-d8bd152eaa6f-kube-api-access\") pod \"2008051d-eda2-484e-815f-d8bd152eaa6f\" (UID: \"2008051d-eda2-484e-815f-d8bd152eaa6f\") " Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.090901 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2008051d-eda2-484e-815f-d8bd152eaa6f-kubelet-dir\") pod \"2008051d-eda2-484e-815f-d8bd152eaa6f\" (UID: \"2008051d-eda2-484e-815f-d8bd152eaa6f\") " Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.091021 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2008051d-eda2-484e-815f-d8bd152eaa6f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2008051d-eda2-484e-815f-d8bd152eaa6f" (UID: "2008051d-eda2-484e-815f-d8bd152eaa6f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.091204 4703 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2008051d-eda2-484e-815f-d8bd152eaa6f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.093836 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2008051d-eda2-484e-815f-d8bd152eaa6f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2008051d-eda2-484e-815f-d8bd152eaa6f" (UID: "2008051d-eda2-484e-815f-d8bd152eaa6f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.157287 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.192149 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2008051d-eda2-484e-815f-d8bd152eaa6f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.608138 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f6bb052e-48ab-43a1-9e9a-c8109a014b1d","Type":"ContainerStarted","Data":"6a2e49c053a553a6e957409e956be34e74e84f3363b675ff9fbf1ea135d390f1"} Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.608204 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f6bb052e-48ab-43a1-9e9a-c8109a014b1d","Type":"ContainerStarted","Data":"ceecff7ae89257a2e7ad3ed4dfd6583e466e8e6a4055a25f6c3cf93d0ce99348"} Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.611653 4703 generic.go:334] "Generic (PLEG): container finished" podID="ba5e15b8-80e8-4d9c-83eb-12dd004ca901" 
containerID="aba2307c3a2d038f289dc933c9a06d74f0d61651d04f08d6bdf587dc8d0dd961" exitCode=0 Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.611856 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551044-rdp77" event={"ID":"ba5e15b8-80e8-4d9c-83eb-12dd004ca901","Type":"ContainerDied","Data":"aba2307c3a2d038f289dc933c9a06d74f0d61651d04f08d6bdf587dc8d0dd961"} Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.614643 4703 generic.go:334] "Generic (PLEG): container finished" podID="64728a68-4675-4652-a800-7f055197862b" containerID="150e49a38897494fd79421901164b2fd14df8a16c15fa522cbe3c4c5208499c3" exitCode=0 Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.614685 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551042-fv4ms" event={"ID":"64728a68-4675-4652-a800-7f055197862b","Type":"ContainerDied","Data":"150e49a38897494fd79421901164b2fd14df8a16c15fa522cbe3c4c5208499c3"} Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.619109 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"2008051d-eda2-484e-815f-d8bd152eaa6f","Type":"ContainerDied","Data":"ed46b89519cdc573fd723af26e904c199b55a42f43fe2fc88076c6f65ef36c6b"} Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.619172 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed46b89519cdc573fd723af26e904c199b55a42f43fe2fc88076c6f65ef36c6b" Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.619253 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.621079 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" event={"ID":"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641","Type":"ContainerStarted","Data":"90a8b14bba906ffd9318e8fe3265796a00ea1ab6b4590ddb26312465293fc5da"} Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.621158 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.626157 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.631939 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.6319240289999999 podStartE2EDuration="1.631924029s" podCreationTimestamp="2026-03-09 13:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:34.630507147 +0000 UTC m=+270.597922843" watchObservedRunningTime="2026-03-09 13:24:34.631924029 +0000 UTC m=+270.599339715" Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.651991 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" podStartSLOduration=9.65197425 podStartE2EDuration="9.65197425s" podCreationTimestamp="2026-03-09 13:24:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:34.650345932 +0000 UTC m=+270.617761618" watchObservedRunningTime="2026-03-09 
13:24:34.65197425 +0000 UTC m=+270.619389936" Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.702330 4703 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-18 04:25:41.935972767 +0000 UTC Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.702370 4703 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7551h1m7.233605341s for next certificate rotation Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.714537 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98e2c5bf-bd56-48af-bf65-e095f4d3767e" path="/var/lib/kubelet/pods/98e2c5bf-bd56-48af-bf65-e095f4d3767e/volumes" Mar 09 13:24:34 crc kubenswrapper[4703]: I0309 13:24:34.715326 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be244c0d-2805-4b7c-8c21-d895e2d037f0" path="/var/lib/kubelet/pods/be244c0d-2805-4b7c-8c21-d895e2d037f0/volumes" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.535094 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-657f5f47cd-9qrrg"] Mar 09 13:24:35 crc kubenswrapper[4703]: E0309 13:24:35.535645 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2008051d-eda2-484e-815f-d8bd152eaa6f" containerName="pruner" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.535659 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="2008051d-eda2-484e-815f-d8bd152eaa6f" containerName="pruner" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.535790 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="2008051d-eda2-484e-815f-d8bd152eaa6f" containerName="pruner" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.536251 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.538167 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.538509 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.538651 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.538689 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.538662 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.542209 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.544705 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.548080 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-657f5f47cd-9qrrg"] Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.608183 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbzls\" (UniqueName: \"kubernetes.io/projected/3c687525-2ca1-4c9d-8690-1f66e7772c37-kube-api-access-tbzls\") pod \"controller-manager-657f5f47cd-9qrrg\" (UID: \"3c687525-2ca1-4c9d-8690-1f66e7772c37\") " 
pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.608239 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c687525-2ca1-4c9d-8690-1f66e7772c37-config\") pod \"controller-manager-657f5f47cd-9qrrg\" (UID: \"3c687525-2ca1-4c9d-8690-1f66e7772c37\") " pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.608286 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c687525-2ca1-4c9d-8690-1f66e7772c37-serving-cert\") pod \"controller-manager-657f5f47cd-9qrrg\" (UID: \"3c687525-2ca1-4c9d-8690-1f66e7772c37\") " pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.608323 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c687525-2ca1-4c9d-8690-1f66e7772c37-client-ca\") pod \"controller-manager-657f5f47cd-9qrrg\" (UID: \"3c687525-2ca1-4c9d-8690-1f66e7772c37\") " pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.608364 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c687525-2ca1-4c9d-8690-1f66e7772c37-proxy-ca-bundles\") pod \"controller-manager-657f5f47cd-9qrrg\" (UID: \"3c687525-2ca1-4c9d-8690-1f66e7772c37\") " pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.702938 4703 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation 
deadline is 2026-12-01 00:52:57.330608153 +0000 UTC Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.702993 4703 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6395h28m21.627617972s for next certificate rotation Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.709824 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c687525-2ca1-4c9d-8690-1f66e7772c37-proxy-ca-bundles\") pod \"controller-manager-657f5f47cd-9qrrg\" (UID: \"3c687525-2ca1-4c9d-8690-1f66e7772c37\") " pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.709952 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbzls\" (UniqueName: \"kubernetes.io/projected/3c687525-2ca1-4c9d-8690-1f66e7772c37-kube-api-access-tbzls\") pod \"controller-manager-657f5f47cd-9qrrg\" (UID: \"3c687525-2ca1-4c9d-8690-1f66e7772c37\") " pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.709979 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c687525-2ca1-4c9d-8690-1f66e7772c37-config\") pod \"controller-manager-657f5f47cd-9qrrg\" (UID: \"3c687525-2ca1-4c9d-8690-1f66e7772c37\") " pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.710043 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c687525-2ca1-4c9d-8690-1f66e7772c37-serving-cert\") pod \"controller-manager-657f5f47cd-9qrrg\" (UID: \"3c687525-2ca1-4c9d-8690-1f66e7772c37\") " pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.710683 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c687525-2ca1-4c9d-8690-1f66e7772c37-client-ca\") pod \"controller-manager-657f5f47cd-9qrrg\" (UID: \"3c687525-2ca1-4c9d-8690-1f66e7772c37\") " pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.711678 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c687525-2ca1-4c9d-8690-1f66e7772c37-client-ca\") pod \"controller-manager-657f5f47cd-9qrrg\" (UID: \"3c687525-2ca1-4c9d-8690-1f66e7772c37\") " pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.711682 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c687525-2ca1-4c9d-8690-1f66e7772c37-proxy-ca-bundles\") pod \"controller-manager-657f5f47cd-9qrrg\" (UID: \"3c687525-2ca1-4c9d-8690-1f66e7772c37\") " pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.712225 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c687525-2ca1-4c9d-8690-1f66e7772c37-config\") pod \"controller-manager-657f5f47cd-9qrrg\" (UID: \"3c687525-2ca1-4c9d-8690-1f66e7772c37\") " pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.716462 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c687525-2ca1-4c9d-8690-1f66e7772c37-serving-cert\") pod \"controller-manager-657f5f47cd-9qrrg\" (UID: \"3c687525-2ca1-4c9d-8690-1f66e7772c37\") " pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" Mar 09 13:24:35 crc 
kubenswrapper[4703]: I0309 13:24:35.730022 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbzls\" (UniqueName: \"kubernetes.io/projected/3c687525-2ca1-4c9d-8690-1f66e7772c37-kube-api-access-tbzls\") pod \"controller-manager-657f5f47cd-9qrrg\" (UID: \"3c687525-2ca1-4c9d-8690-1f66e7772c37\") " pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.865582 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551042-fv4ms" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.871470 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551044-rdp77" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.885064 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.913126 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2mmt\" (UniqueName: \"kubernetes.io/projected/64728a68-4675-4652-a800-7f055197862b-kube-api-access-h2mmt\") pod \"64728a68-4675-4652-a800-7f055197862b\" (UID: \"64728a68-4675-4652-a800-7f055197862b\") " Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.913293 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9wql\" (UniqueName: \"kubernetes.io/projected/ba5e15b8-80e8-4d9c-83eb-12dd004ca901-kube-api-access-l9wql\") pod \"ba5e15b8-80e8-4d9c-83eb-12dd004ca901\" (UID: \"ba5e15b8-80e8-4d9c-83eb-12dd004ca901\") " Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.916173 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba5e15b8-80e8-4d9c-83eb-12dd004ca901-kube-api-access-l9wql" 
(OuterVolumeSpecName: "kube-api-access-l9wql") pod "ba5e15b8-80e8-4d9c-83eb-12dd004ca901" (UID: "ba5e15b8-80e8-4d9c-83eb-12dd004ca901"). InnerVolumeSpecName "kube-api-access-l9wql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:35 crc kubenswrapper[4703]: I0309 13:24:35.916392 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64728a68-4675-4652-a800-7f055197862b-kube-api-access-h2mmt" (OuterVolumeSpecName: "kube-api-access-h2mmt") pod "64728a68-4675-4652-a800-7f055197862b" (UID: "64728a68-4675-4652-a800-7f055197862b"). InnerVolumeSpecName "kube-api-access-h2mmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:36 crc kubenswrapper[4703]: I0309 13:24:36.015217 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9wql\" (UniqueName: \"kubernetes.io/projected/ba5e15b8-80e8-4d9c-83eb-12dd004ca901-kube-api-access-l9wql\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:36 crc kubenswrapper[4703]: I0309 13:24:36.015529 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2mmt\" (UniqueName: \"kubernetes.io/projected/64728a68-4675-4652-a800-7f055197862b-kube-api-access-h2mmt\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:36 crc kubenswrapper[4703]: I0309 13:24:36.267857 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-657f5f47cd-9qrrg"] Mar 09 13:24:36 crc kubenswrapper[4703]: W0309 13:24:36.274171 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c687525_2ca1_4c9d_8690_1f66e7772c37.slice/crio-2609c2111a2d3e4ec80283b992039a5b60dbbd3710162fa2a24e6fca18642f32 WatchSource:0}: Error finding container 2609c2111a2d3e4ec80283b992039a5b60dbbd3710162fa2a24e6fca18642f32: Status 404 returned error can't find the container with id 2609c2111a2d3e4ec80283b992039a5b60dbbd3710162fa2a24e6fca18642f32 Mar 
09 13:24:36 crc kubenswrapper[4703]: I0309 13:24:36.634939 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551042-fv4ms" event={"ID":"64728a68-4675-4652-a800-7f055197862b","Type":"ContainerDied","Data":"a95a5b95b41a1da0f96330b97a3b100cd810121f09a78b3d2a63aa8174ca1c16"} Mar 09 13:24:36 crc kubenswrapper[4703]: I0309 13:24:36.634976 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a95a5b95b41a1da0f96330b97a3b100cd810121f09a78b3d2a63aa8174ca1c16" Mar 09 13:24:36 crc kubenswrapper[4703]: I0309 13:24:36.635024 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551042-fv4ms" Mar 09 13:24:36 crc kubenswrapper[4703]: I0309 13:24:36.639383 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" event={"ID":"3c687525-2ca1-4c9d-8690-1f66e7772c37","Type":"ContainerStarted","Data":"2609c2111a2d3e4ec80283b992039a5b60dbbd3710162fa2a24e6fca18642f32"} Mar 09 13:24:36 crc kubenswrapper[4703]: I0309 13:24:36.641994 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551044-rdp77" Mar 09 13:24:36 crc kubenswrapper[4703]: I0309 13:24:36.642978 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551044-rdp77" event={"ID":"ba5e15b8-80e8-4d9c-83eb-12dd004ca901","Type":"ContainerDied","Data":"0af3864c2cde18e1ed3b5035508cf3132264890774db509235e40f6d605d4f3a"} Mar 09 13:24:36 crc kubenswrapper[4703]: I0309 13:24:36.643037 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0af3864c2cde18e1ed3b5035508cf3132264890774db509235e40f6d605d4f3a" Mar 09 13:24:37 crc kubenswrapper[4703]: I0309 13:24:37.647549 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" event={"ID":"3c687525-2ca1-4c9d-8690-1f66e7772c37","Type":"ContainerStarted","Data":"1bebdce81fe85020d87c8246196eb2da8ec82074698a59d4d5c7966c2f4d433e"} Mar 09 13:24:37 crc kubenswrapper[4703]: I0309 13:24:37.648099 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" Mar 09 13:24:37 crc kubenswrapper[4703]: I0309 13:24:37.656015 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" Mar 09 13:24:37 crc kubenswrapper[4703]: I0309 13:24:37.670454 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" podStartSLOduration=12.670433003 podStartE2EDuration="12.670433003s" podCreationTimestamp="2026-03-09 13:24:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:37.666008693 +0000 UTC m=+273.633424379" watchObservedRunningTime="2026-03-09 13:24:37.670433003 +0000 UTC m=+273.637848689" Mar 09 13:24:39 crc 
kubenswrapper[4703]: I0309 13:24:39.505333 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:24:39 crc kubenswrapper[4703]: I0309 13:24:39.505429 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:24:39 crc kubenswrapper[4703]: I0309 13:24:39.505501 4703 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" Mar 09 13:24:39 crc kubenswrapper[4703]: I0309 13:24:39.506554 4703 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4c9fc3d67a15d4a66d56ea7b57899de24cd14117fceeac979967d0eb1b49823f"} pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:24:39 crc kubenswrapper[4703]: I0309 13:24:39.506615 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" containerID="cri-o://4c9fc3d67a15d4a66d56ea7b57899de24cd14117fceeac979967d0eb1b49823f" gracePeriod=600 Mar 09 13:24:40 crc kubenswrapper[4703]: I0309 13:24:40.663812 4703 generic.go:334] "Generic (PLEG): container finished" podID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" 
containerID="4c9fc3d67a15d4a66d56ea7b57899de24cd14117fceeac979967d0eb1b49823f" exitCode=0 Mar 09 13:24:40 crc kubenswrapper[4703]: I0309 13:24:40.663867 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" event={"ID":"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29","Type":"ContainerDied","Data":"4c9fc3d67a15d4a66d56ea7b57899de24cd14117fceeac979967d0eb1b49823f"} Mar 09 13:24:41 crc kubenswrapper[4703]: I0309 13:24:41.672003 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" event={"ID":"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29","Type":"ContainerStarted","Data":"811feda9599d695e42cb52b203e2807fe7b8b341b8ecccd4904c7902531d2ff1"} Mar 09 13:24:43 crc kubenswrapper[4703]: I0309 13:24:43.684896 4703 generic.go:334] "Generic (PLEG): container finished" podID="a7eceeab-0758-4f0f-88c7-5b744fbab868" containerID="ff5f099ff9aed37285569f642ce22a2bc54361f18f227bbc4682c7fd61f767ca" exitCode=0 Mar 09 13:24:43 crc kubenswrapper[4703]: I0309 13:24:43.684932 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5rmg" event={"ID":"a7eceeab-0758-4f0f-88c7-5b744fbab868","Type":"ContainerDied","Data":"ff5f099ff9aed37285569f642ce22a2bc54361f18f227bbc4682c7fd61f767ca"} Mar 09 13:24:43 crc kubenswrapper[4703]: I0309 13:24:43.687434 4703 generic.go:334] "Generic (PLEG): container finished" podID="adee29a3-c626-4dc3-8323-ce1e852faea3" containerID="f935878b863d664b48d1afa38d3f3997820f8c387d0ace58938614e1ac4614f4" exitCode=0 Mar 09 13:24:43 crc kubenswrapper[4703]: I0309 13:24:43.687491 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tq2q" event={"ID":"adee29a3-c626-4dc3-8323-ce1e852faea3","Type":"ContainerDied","Data":"f935878b863d664b48d1afa38d3f3997820f8c387d0ace58938614e1ac4614f4"} Mar 09 13:24:44 crc kubenswrapper[4703]: I0309 13:24:44.696572 4703 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5rmg" event={"ID":"a7eceeab-0758-4f0f-88c7-5b744fbab868","Type":"ContainerStarted","Data":"f85f27e053a633e84612bfec49a94a8ef0d8c405ff58452f07501008893cb1e3"} Mar 09 13:24:44 crc kubenswrapper[4703]: I0309 13:24:44.705182 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tq2q" event={"ID":"adee29a3-c626-4dc3-8323-ce1e852faea3","Type":"ContainerStarted","Data":"6429dd51afe70aa90258b9aae53ce56e1a8b3e9c6bb03437e602a91dd69bf2d8"} Mar 09 13:24:44 crc kubenswrapper[4703]: I0309 13:24:44.727620 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j5rmg" podStartSLOduration=2.545771582 podStartE2EDuration="54.727594793s" podCreationTimestamp="2026-03-09 13:23:50 +0000 UTC" firstStartedPulling="2026-03-09 13:23:51.944233745 +0000 UTC m=+227.911649431" lastFinishedPulling="2026-03-09 13:24:44.126056956 +0000 UTC m=+280.093472642" observedRunningTime="2026-03-09 13:24:44.725151041 +0000 UTC m=+280.692566727" watchObservedRunningTime="2026-03-09 13:24:44.727594793 +0000 UTC m=+280.695010519" Mar 09 13:24:44 crc kubenswrapper[4703]: I0309 13:24:44.784907 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9tq2q" podStartSLOduration=2.433796388 podStartE2EDuration="55.784884653s" podCreationTimestamp="2026-03-09 13:23:49 +0000 UTC" firstStartedPulling="2026-03-09 13:23:50.887970003 +0000 UTC m=+226.855385689" lastFinishedPulling="2026-03-09 13:24:44.239058268 +0000 UTC m=+280.206473954" observedRunningTime="2026-03-09 13:24:44.783424259 +0000 UTC m=+280.750839995" watchObservedRunningTime="2026-03-09 13:24:44.784884653 +0000 UTC m=+280.752300349" Mar 09 13:24:45 crc kubenswrapper[4703]: I0309 13:24:45.678582 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-657f5f47cd-9qrrg"] Mar 09 13:24:45 crc kubenswrapper[4703]: I0309 13:24:45.679055 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" podUID="3c687525-2ca1-4c9d-8690-1f66e7772c37" containerName="controller-manager" containerID="cri-o://1bebdce81fe85020d87c8246196eb2da8ec82074698a59d4d5c7966c2f4d433e" gracePeriod=30 Mar 09 13:24:45 crc kubenswrapper[4703]: I0309 13:24:45.701481 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7"] Mar 09 13:24:45 crc kubenswrapper[4703]: I0309 13:24:45.701670 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" podUID="4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641" containerName="route-controller-manager" containerID="cri-o://90a8b14bba906ffd9318e8fe3265796a00ea1ab6b4590ddb26312465293fc5da" gracePeriod=30 Mar 09 13:24:45 crc kubenswrapper[4703]: I0309 13:24:45.715180 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kvftn" event={"ID":"174b5189-1afe-40a3-813b-052dd29ad296","Type":"ContainerStarted","Data":"f419bb8679aebc16727a480ccee52b111fcc5fd59cb0d7aeb8bbe7a1281ecd6a"} Mar 09 13:24:45 crc kubenswrapper[4703]: I0309 13:24:45.720532 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gzfj" event={"ID":"7c0fe86d-3006-44b3-811a-f28758ef07fd","Type":"ContainerStarted","Data":"2f62f352c6264b3dcfb8d64697b49cbd742af74867f6830391d8064111810af6"} Mar 09 13:24:45 crc kubenswrapper[4703]: I0309 13:24:45.721927 4703 generic.go:334] "Generic (PLEG): container finished" podID="09a58d8e-1a70-4e6c-b217-1c2b2537accc" containerID="ec53da7bfbecd5a1a86e9e8b080ef549b6d4fab2bf4d31e6a43f145a261b27cc" exitCode=0 Mar 09 13:24:45 crc 
kubenswrapper[4703]: I0309 13:24:45.721993 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gkg4j" event={"ID":"09a58d8e-1a70-4e6c-b217-1c2b2537accc","Type":"ContainerDied","Data":"ec53da7bfbecd5a1a86e9e8b080ef549b6d4fab2bf4d31e6a43f145a261b27cc"} Mar 09 13:24:45 crc kubenswrapper[4703]: I0309 13:24:45.725739 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zc87r" event={"ID":"0ab69d10-042e-422a-830e-65d3d1132197","Type":"ContainerStarted","Data":"bb54e8aca747de72ce66c6c740a8a507de03e0d0cf52211d06d9cb80d9390738"} Mar 09 13:24:45 crc kubenswrapper[4703]: I0309 13:24:45.887699 4703 patch_prober.go:28] interesting pod/controller-manager-657f5f47cd-9qrrg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" start-of-body= Mar 09 13:24:45 crc kubenswrapper[4703]: I0309 13:24:45.887766 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" podUID="3c687525-2ca1-4c9d-8690-1f66e7772c37" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.218714 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.324253 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-client-ca\") pod \"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641\" (UID: \"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641\") " Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.324356 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-serving-cert\") pod \"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641\" (UID: \"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641\") " Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.324388 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hph4v\" (UniqueName: \"kubernetes.io/projected/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-kube-api-access-hph4v\") pod \"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641\" (UID: \"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641\") " Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.324447 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-config\") pod \"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641\" (UID: \"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641\") " Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.325266 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-client-ca" (OuterVolumeSpecName: "client-ca") pod "4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641" (UID: "4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.325383 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-config" (OuterVolumeSpecName: "config") pod "4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641" (UID: "4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.330400 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-kube-api-access-hph4v" (OuterVolumeSpecName: "kube-api-access-hph4v") pod "4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641" (UID: "4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641"). InnerVolumeSpecName "kube-api-access-hph4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.332185 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641" (UID: "4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.388282 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.443673 4703 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.443718 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.443733 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hph4v\" (UniqueName: \"kubernetes.io/projected/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-kube-api-access-hph4v\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.443745 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.544141 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c687525-2ca1-4c9d-8690-1f66e7772c37-serving-cert\") pod \"3c687525-2ca1-4c9d-8690-1f66e7772c37\" (UID: \"3c687525-2ca1-4c9d-8690-1f66e7772c37\") " Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.544202 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbzls\" (UniqueName: \"kubernetes.io/projected/3c687525-2ca1-4c9d-8690-1f66e7772c37-kube-api-access-tbzls\") pod \"3c687525-2ca1-4c9d-8690-1f66e7772c37\" (UID: \"3c687525-2ca1-4c9d-8690-1f66e7772c37\") " Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.544342 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c687525-2ca1-4c9d-8690-1f66e7772c37-proxy-ca-bundles\") pod \"3c687525-2ca1-4c9d-8690-1f66e7772c37\" (UID: \"3c687525-2ca1-4c9d-8690-1f66e7772c37\") " Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.544421 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c687525-2ca1-4c9d-8690-1f66e7772c37-client-ca\") pod \"3c687525-2ca1-4c9d-8690-1f66e7772c37\" (UID: \"3c687525-2ca1-4c9d-8690-1f66e7772c37\") " Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.544451 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c687525-2ca1-4c9d-8690-1f66e7772c37-config\") pod \"3c687525-2ca1-4c9d-8690-1f66e7772c37\" (UID: \"3c687525-2ca1-4c9d-8690-1f66e7772c37\") " Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.545448 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c687525-2ca1-4c9d-8690-1f66e7772c37-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3c687525-2ca1-4c9d-8690-1f66e7772c37" (UID: "3c687525-2ca1-4c9d-8690-1f66e7772c37"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.545541 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c687525-2ca1-4c9d-8690-1f66e7772c37-client-ca" (OuterVolumeSpecName: "client-ca") pod "3c687525-2ca1-4c9d-8690-1f66e7772c37" (UID: "3c687525-2ca1-4c9d-8690-1f66e7772c37"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.545555 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c687525-2ca1-4c9d-8690-1f66e7772c37-config" (OuterVolumeSpecName: "config") pod "3c687525-2ca1-4c9d-8690-1f66e7772c37" (UID: "3c687525-2ca1-4c9d-8690-1f66e7772c37"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.548352 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c687525-2ca1-4c9d-8690-1f66e7772c37-kube-api-access-tbzls" (OuterVolumeSpecName: "kube-api-access-tbzls") pod "3c687525-2ca1-4c9d-8690-1f66e7772c37" (UID: "3c687525-2ca1-4c9d-8690-1f66e7772c37"). InnerVolumeSpecName "kube-api-access-tbzls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.549004 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c687525-2ca1-4c9d-8690-1f66e7772c37-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3c687525-2ca1-4c9d-8690-1f66e7772c37" (UID: "3c687525-2ca1-4c9d-8690-1f66e7772c37"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.645573 4703 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c687525-2ca1-4c9d-8690-1f66e7772c37-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.645816 4703 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c687525-2ca1-4c9d-8690-1f66e7772c37-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.645957 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c687525-2ca1-4c9d-8690-1f66e7772c37-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.646042 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c687525-2ca1-4c9d-8690-1f66e7772c37-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.646122 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbzls\" (UniqueName: \"kubernetes.io/projected/3c687525-2ca1-4c9d-8690-1f66e7772c37-kube-api-access-tbzls\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.734474 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gkg4j" event={"ID":"09a58d8e-1a70-4e6c-b217-1c2b2537accc","Type":"ContainerStarted","Data":"265ebf0960400e6220159eb48249182b4956971c37888f1faefa7c863729700b"} Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.736385 4703 generic.go:334] "Generic (PLEG): container finished" podID="4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641" containerID="90a8b14bba906ffd9318e8fe3265796a00ea1ab6b4590ddb26312465293fc5da" exitCode=0 Mar 09 13:24:46 crc kubenswrapper[4703]: 
I0309 13:24:46.736442 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" event={"ID":"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641","Type":"ContainerDied","Data":"90a8b14bba906ffd9318e8fe3265796a00ea1ab6b4590ddb26312465293fc5da"} Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.736470 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" event={"ID":"4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641","Type":"ContainerDied","Data":"6dc05efe4dd9812cad8ade7f8719e48cb68d457268ee76efc0dcfb67db81c71e"} Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.736487 4703 scope.go:117] "RemoveContainer" containerID="90a8b14bba906ffd9318e8fe3265796a00ea1ab6b4590ddb26312465293fc5da" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.736580 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.739120 4703 generic.go:334] "Generic (PLEG): container finished" podID="0ab69d10-042e-422a-830e-65d3d1132197" containerID="bb54e8aca747de72ce66c6c740a8a507de03e0d0cf52211d06d9cb80d9390738" exitCode=0 Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.739164 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zc87r" event={"ID":"0ab69d10-042e-422a-830e-65d3d1132197","Type":"ContainerDied","Data":"bb54e8aca747de72ce66c6c740a8a507de03e0d0cf52211d06d9cb80d9390738"} Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.741638 4703 generic.go:334] "Generic (PLEG): container finished" podID="174b5189-1afe-40a3-813b-052dd29ad296" containerID="f419bb8679aebc16727a480ccee52b111fcc5fd59cb0d7aeb8bbe7a1281ecd6a" exitCode=0 Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.741832 4703 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-kvftn" event={"ID":"174b5189-1afe-40a3-813b-052dd29ad296","Type":"ContainerDied","Data":"f419bb8679aebc16727a480ccee52b111fcc5fd59cb0d7aeb8bbe7a1281ecd6a"} Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.749349 4703 generic.go:334] "Generic (PLEG): container finished" podID="3c687525-2ca1-4c9d-8690-1f66e7772c37" containerID="1bebdce81fe85020d87c8246196eb2da8ec82074698a59d4d5c7966c2f4d433e" exitCode=0 Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.749416 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" event={"ID":"3c687525-2ca1-4c9d-8690-1f66e7772c37","Type":"ContainerDied","Data":"1bebdce81fe85020d87c8246196eb2da8ec82074698a59d4d5c7966c2f4d433e"} Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.749450 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" event={"ID":"3c687525-2ca1-4c9d-8690-1f66e7772c37","Type":"ContainerDied","Data":"2609c2111a2d3e4ec80283b992039a5b60dbbd3710162fa2a24e6fca18642f32"} Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.749515 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-657f5f47cd-9qrrg" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.756485 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gkg4j" podStartSLOduration=2.767211913 podStartE2EDuration="54.756468268s" podCreationTimestamp="2026-03-09 13:23:52 +0000 UTC" firstStartedPulling="2026-03-09 13:23:54.236055101 +0000 UTC m=+230.203470787" lastFinishedPulling="2026-03-09 13:24:46.225311456 +0000 UTC m=+282.192727142" observedRunningTime="2026-03-09 13:24:46.756165139 +0000 UTC m=+282.723580825" watchObservedRunningTime="2026-03-09 13:24:46.756468268 +0000 UTC m=+282.723883954" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.757813 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m75f5" event={"ID":"75d9c9d6-9701-4627-a87b-b2ae669a0eae","Type":"ContainerStarted","Data":"955e9a798cc83fcd9da3cd05e3b96611174c2cec57237cac82eda3dcaead74a6"} Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.763183 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pqzw" event={"ID":"3916d955-3b83-488d-959c-88b09424a3e3","Type":"ContainerStarted","Data":"8b12cfc49694669222dda6e4b22c8d0eff402ac2f8dd5ecaca745065b6e11184"} Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.775118 4703 generic.go:334] "Generic (PLEG): container finished" podID="7c0fe86d-3006-44b3-811a-f28758ef07fd" containerID="2f62f352c6264b3dcfb8d64697b49cbd742af74867f6830391d8064111810af6" exitCode=0 Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.775184 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gzfj" event={"ID":"7c0fe86d-3006-44b3-811a-f28758ef07fd","Type":"ContainerDied","Data":"2f62f352c6264b3dcfb8d64697b49cbd742af74867f6830391d8064111810af6"} Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 
13:24:46.806191 4703 scope.go:117] "RemoveContainer" containerID="90a8b14bba906ffd9318e8fe3265796a00ea1ab6b4590ddb26312465293fc5da" Mar 09 13:24:46 crc kubenswrapper[4703]: E0309 13:24:46.806736 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90a8b14bba906ffd9318e8fe3265796a00ea1ab6b4590ddb26312465293fc5da\": container with ID starting with 90a8b14bba906ffd9318e8fe3265796a00ea1ab6b4590ddb26312465293fc5da not found: ID does not exist" containerID="90a8b14bba906ffd9318e8fe3265796a00ea1ab6b4590ddb26312465293fc5da" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.806784 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90a8b14bba906ffd9318e8fe3265796a00ea1ab6b4590ddb26312465293fc5da"} err="failed to get container status \"90a8b14bba906ffd9318e8fe3265796a00ea1ab6b4590ddb26312465293fc5da\": rpc error: code = NotFound desc = could not find container \"90a8b14bba906ffd9318e8fe3265796a00ea1ab6b4590ddb26312465293fc5da\": container with ID starting with 90a8b14bba906ffd9318e8fe3265796a00ea1ab6b4590ddb26312465293fc5da not found: ID does not exist" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.806812 4703 scope.go:117] "RemoveContainer" containerID="1bebdce81fe85020d87c8246196eb2da8ec82074698a59d4d5c7966c2f4d433e" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.822660 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-657f5f47cd-9qrrg"] Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.824002 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-657f5f47cd-9qrrg"] Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.864020 4703 scope.go:117] "RemoveContainer" containerID="1bebdce81fe85020d87c8246196eb2da8ec82074698a59d4d5c7966c2f4d433e" Mar 09 13:24:46 crc kubenswrapper[4703]: E0309 13:24:46.864545 4703 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bebdce81fe85020d87c8246196eb2da8ec82074698a59d4d5c7966c2f4d433e\": container with ID starting with 1bebdce81fe85020d87c8246196eb2da8ec82074698a59d4d5c7966c2f4d433e not found: ID does not exist" containerID="1bebdce81fe85020d87c8246196eb2da8ec82074698a59d4d5c7966c2f4d433e" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.864572 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bebdce81fe85020d87c8246196eb2da8ec82074698a59d4d5c7966c2f4d433e"} err="failed to get container status \"1bebdce81fe85020d87c8246196eb2da8ec82074698a59d4d5c7966c2f4d433e\": rpc error: code = NotFound desc = could not find container \"1bebdce81fe85020d87c8246196eb2da8ec82074698a59d4d5c7966c2f4d433e\": container with ID starting with 1bebdce81fe85020d87c8246196eb2da8ec82074698a59d4d5c7966c2f4d433e not found: ID does not exist" Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.883931 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7"] Mar 09 13:24:46 crc kubenswrapper[4703]: I0309 13:24:46.887814 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf5d85c49-7fgv7"] Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.564767 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-fd569c996-k954g"] Mar 09 13:24:47 crc kubenswrapper[4703]: E0309 13:24:47.565996 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64728a68-4675-4652-a800-7f055197862b" containerName="oc" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.566041 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="64728a68-4675-4652-a800-7f055197862b" containerName="oc" Mar 09 13:24:47 crc kubenswrapper[4703]: E0309 
13:24:47.566060 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba5e15b8-80e8-4d9c-83eb-12dd004ca901" containerName="oc" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.566067 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba5e15b8-80e8-4d9c-83eb-12dd004ca901" containerName="oc" Mar 09 13:24:47 crc kubenswrapper[4703]: E0309 13:24:47.566097 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c687525-2ca1-4c9d-8690-1f66e7772c37" containerName="controller-manager" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.566107 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c687525-2ca1-4c9d-8690-1f66e7772c37" containerName="controller-manager" Mar 09 13:24:47 crc kubenswrapper[4703]: E0309 13:24:47.566125 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641" containerName="route-controller-manager" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.566133 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641" containerName="route-controller-manager" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.566407 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba5e15b8-80e8-4d9c-83eb-12dd004ca901" containerName="oc" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.566423 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641" containerName="route-controller-manager" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.566453 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c687525-2ca1-4c9d-8690-1f66e7772c37" containerName="controller-manager" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.566464 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="64728a68-4675-4652-a800-7f055197862b" containerName="oc" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 
13:24:47.569080 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fd569c996-k954g" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.571923 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.572445 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.572576 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.572692 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.572788 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.573225 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.575082 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss"] Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.576034 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.580533 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.580763 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.581124 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.581150 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.581616 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.585409 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.586003 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.587259 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fd569c996-k954g"] Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.594612 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss"] Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.669077 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/58cdc96b-614c-4784-b144-dd817f60399c-client-ca\") pod \"route-controller-manager-76d69bdc5c-rmzss\" (UID: \"58cdc96b-614c-4784-b144-dd817f60399c\") " pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.669138 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58cdc96b-614c-4784-b144-dd817f60399c-config\") pod \"route-controller-manager-76d69bdc5c-rmzss\" (UID: \"58cdc96b-614c-4784-b144-dd817f60399c\") " pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.669200 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-config\") pod \"controller-manager-fd569c996-k954g\" (UID: \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\") " pod="openshift-controller-manager/controller-manager-fd569c996-k954g" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.669246 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-serving-cert\") pod \"controller-manager-fd569c996-k954g\" (UID: \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\") " pod="openshift-controller-manager/controller-manager-fd569c996-k954g" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.669278 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4r5d\" (UniqueName: \"kubernetes.io/projected/58cdc96b-614c-4784-b144-dd817f60399c-kube-api-access-t4r5d\") pod \"route-controller-manager-76d69bdc5c-rmzss\" (UID: \"58cdc96b-614c-4784-b144-dd817f60399c\") " 
pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.670713 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-proxy-ca-bundles\") pod \"controller-manager-fd569c996-k954g\" (UID: \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\") " pod="openshift-controller-manager/controller-manager-fd569c996-k954g" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.670782 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm4vh\" (UniqueName: \"kubernetes.io/projected/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-kube-api-access-sm4vh\") pod \"controller-manager-fd569c996-k954g\" (UID: \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\") " pod="openshift-controller-manager/controller-manager-fd569c996-k954g" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.670971 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58cdc96b-614c-4784-b144-dd817f60399c-serving-cert\") pod \"route-controller-manager-76d69bdc5c-rmzss\" (UID: \"58cdc96b-614c-4784-b144-dd817f60399c\") " pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.671080 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-client-ca\") pod \"controller-manager-fd569c996-k954g\" (UID: \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\") " pod="openshift-controller-manager/controller-manager-fd569c996-k954g" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.772478 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58cdc96b-614c-4784-b144-dd817f60399c-serving-cert\") pod \"route-controller-manager-76d69bdc5c-rmzss\" (UID: \"58cdc96b-614c-4784-b144-dd817f60399c\") " pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.772536 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-client-ca\") pod \"controller-manager-fd569c996-k954g\" (UID: \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\") " pod="openshift-controller-manager/controller-manager-fd569c996-k954g" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.772604 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58cdc96b-614c-4784-b144-dd817f60399c-client-ca\") pod \"route-controller-manager-76d69bdc5c-rmzss\" (UID: \"58cdc96b-614c-4784-b144-dd817f60399c\") " pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.772644 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58cdc96b-614c-4784-b144-dd817f60399c-config\") pod \"route-controller-manager-76d69bdc5c-rmzss\" (UID: \"58cdc96b-614c-4784-b144-dd817f60399c\") " pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.772675 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-config\") pod \"controller-manager-fd569c996-k954g\" (UID: \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\") " pod="openshift-controller-manager/controller-manager-fd569c996-k954g" Mar 09 13:24:47 crc 
kubenswrapper[4703]: I0309 13:24:47.772720 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-serving-cert\") pod \"controller-manager-fd569c996-k954g\" (UID: \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\") " pod="openshift-controller-manager/controller-manager-fd569c996-k954g" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.772746 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4r5d\" (UniqueName: \"kubernetes.io/projected/58cdc96b-614c-4784-b144-dd817f60399c-kube-api-access-t4r5d\") pod \"route-controller-manager-76d69bdc5c-rmzss\" (UID: \"58cdc96b-614c-4784-b144-dd817f60399c\") " pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.772816 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-proxy-ca-bundles\") pod \"controller-manager-fd569c996-k954g\" (UID: \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\") " pod="openshift-controller-manager/controller-manager-fd569c996-k954g" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.772837 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm4vh\" (UniqueName: \"kubernetes.io/projected/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-kube-api-access-sm4vh\") pod \"controller-manager-fd569c996-k954g\" (UID: \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\") " pod="openshift-controller-manager/controller-manager-fd569c996-k954g" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.775830 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-proxy-ca-bundles\") pod 
\"controller-manager-fd569c996-k954g\" (UID: \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\") " pod="openshift-controller-manager/controller-manager-fd569c996-k954g" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.775969 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58cdc96b-614c-4784-b144-dd817f60399c-config\") pod \"route-controller-manager-76d69bdc5c-rmzss\" (UID: \"58cdc96b-614c-4784-b144-dd817f60399c\") " pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.776285 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-config\") pod \"controller-manager-fd569c996-k954g\" (UID: \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\") " pod="openshift-controller-manager/controller-manager-fd569c996-k954g" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.776785 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-client-ca\") pod \"controller-manager-fd569c996-k954g\" (UID: \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\") " pod="openshift-controller-manager/controller-manager-fd569c996-k954g" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.776882 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58cdc96b-614c-4784-b144-dd817f60399c-client-ca\") pod \"route-controller-manager-76d69bdc5c-rmzss\" (UID: \"58cdc96b-614c-4784-b144-dd817f60399c\") " pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.783910 4703 generic.go:334] "Generic (PLEG): container finished" podID="75d9c9d6-9701-4627-a87b-b2ae669a0eae" 
containerID="955e9a798cc83fcd9da3cd05e3b96611174c2cec57237cac82eda3dcaead74a6" exitCode=0 Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.783974 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m75f5" event={"ID":"75d9c9d6-9701-4627-a87b-b2ae669a0eae","Type":"ContainerDied","Data":"955e9a798cc83fcd9da3cd05e3b96611174c2cec57237cac82eda3dcaead74a6"} Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.784003 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m75f5" event={"ID":"75d9c9d6-9701-4627-a87b-b2ae669a0eae","Type":"ContainerStarted","Data":"c1649cbb037f0ab5e9da86f0e4b7bc95cc003b46aa66f5a079246e7083ca2b70"} Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.785379 4703 generic.go:334] "Generic (PLEG): container finished" podID="3916d955-3b83-488d-959c-88b09424a3e3" containerID="8b12cfc49694669222dda6e4b22c8d0eff402ac2f8dd5ecaca745065b6e11184" exitCode=0 Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.785417 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pqzw" event={"ID":"3916d955-3b83-488d-959c-88b09424a3e3","Type":"ContainerDied","Data":"8b12cfc49694669222dda6e4b22c8d0eff402ac2f8dd5ecaca745065b6e11184"} Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.796060 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-serving-cert\") pod \"controller-manager-fd569c996-k954g\" (UID: \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\") " pod="openshift-controller-manager/controller-manager-fd569c996-k954g" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.796485 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58cdc96b-614c-4784-b144-dd817f60399c-serving-cert\") pod 
\"route-controller-manager-76d69bdc5c-rmzss\" (UID: \"58cdc96b-614c-4784-b144-dd817f60399c\") " pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.797546 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm4vh\" (UniqueName: \"kubernetes.io/projected/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-kube-api-access-sm4vh\") pod \"controller-manager-fd569c996-k954g\" (UID: \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\") " pod="openshift-controller-manager/controller-manager-fd569c996-k954g" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.797821 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4r5d\" (UniqueName: \"kubernetes.io/projected/58cdc96b-614c-4784-b144-dd817f60399c-kube-api-access-t4r5d\") pod \"route-controller-manager-76d69bdc5c-rmzss\" (UID: \"58cdc96b-614c-4784-b144-dd817f60399c\") " pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" Mar 09 13:24:47 crc kubenswrapper[4703]: I0309 13:24:47.994869 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fd569c996-k954g" Mar 09 13:24:48 crc kubenswrapper[4703]: I0309 13:24:48.006127 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" Mar 09 13:24:48 crc kubenswrapper[4703]: I0309 13:24:48.299532 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss"] Mar 09 13:24:48 crc kubenswrapper[4703]: W0309 13:24:48.306377 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58cdc96b_614c_4784_b144_dd817f60399c.slice/crio-0163b868729b663933a06dcc3de16d0cc18a7f57793042d932baecd53b978704 WatchSource:0}: Error finding container 0163b868729b663933a06dcc3de16d0cc18a7f57793042d932baecd53b978704: Status 404 returned error can't find the container with id 0163b868729b663933a06dcc3de16d0cc18a7f57793042d932baecd53b978704 Mar 09 13:24:48 crc kubenswrapper[4703]: I0309 13:24:48.465603 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fd569c996-k954g"] Mar 09 13:24:48 crc kubenswrapper[4703]: W0309 13:24:48.470139 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28cd609c_fb43_4b67_9ffb_1a49482bd6ac.slice/crio-36691124b31819182561e68f5bd649d9b25afdc487b902595b7bab3020babdda WatchSource:0}: Error finding container 36691124b31819182561e68f5bd649d9b25afdc487b902595b7bab3020babdda: Status 404 returned error can't find the container with id 36691124b31819182561e68f5bd649d9b25afdc487b902595b7bab3020babdda Mar 09 13:24:48 crc kubenswrapper[4703]: I0309 13:24:48.718707 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c687525-2ca1-4c9d-8690-1f66e7772c37" path="/var/lib/kubelet/pods/3c687525-2ca1-4c9d-8690-1f66e7772c37/volumes" Mar 09 13:24:48 crc kubenswrapper[4703]: I0309 13:24:48.719595 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641" path="/var/lib/kubelet/pods/4d2d7f1e-d8ba-44ad-a7bd-3d3e00ab7641/volumes" Mar 09 13:24:48 crc kubenswrapper[4703]: I0309 13:24:48.795050 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" event={"ID":"58cdc96b-614c-4784-b144-dd817f60399c","Type":"ContainerStarted","Data":"4ea3f87802b135c036106271a704fb252ba227ca1b4ea25c024a71519d486e84"} Mar 09 13:24:48 crc kubenswrapper[4703]: I0309 13:24:48.795969 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" event={"ID":"58cdc96b-614c-4784-b144-dd817f60399c","Type":"ContainerStarted","Data":"0163b868729b663933a06dcc3de16d0cc18a7f57793042d932baecd53b978704"} Mar 09 13:24:48 crc kubenswrapper[4703]: I0309 13:24:48.796369 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" Mar 09 13:24:48 crc kubenswrapper[4703]: I0309 13:24:48.799900 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pqzw" event={"ID":"3916d955-3b83-488d-959c-88b09424a3e3","Type":"ContainerStarted","Data":"cdbe06f5494203db8660cb9dcb0ddad1b78893036fe12f3c667bb6d5cb731c67"} Mar 09 13:24:48 crc kubenswrapper[4703]: I0309 13:24:48.801795 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gzfj" event={"ID":"7c0fe86d-3006-44b3-811a-f28758ef07fd","Type":"ContainerStarted","Data":"f27338878355dd383f36c72cdd0f9d228c67b9784548da4055b6646e2cef10b9"} Mar 09 13:24:48 crc kubenswrapper[4703]: I0309 13:24:48.804113 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zc87r" 
event={"ID":"0ab69d10-042e-422a-830e-65d3d1132197","Type":"ContainerStarted","Data":"b6bb7f2fe6b79256167e3abbb980a757cf8b978503a5884a36cfc1c109315108"} Mar 09 13:24:48 crc kubenswrapper[4703]: I0309 13:24:48.806438 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kvftn" event={"ID":"174b5189-1afe-40a3-813b-052dd29ad296","Type":"ContainerStarted","Data":"4c52deea3c2497e76ca0e8aeee775c9afd2392c12f3ab422874576d073fda039"} Mar 09 13:24:48 crc kubenswrapper[4703]: I0309 13:24:48.807946 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fd569c996-k954g" event={"ID":"28cd609c-fb43-4b67-9ffb-1a49482bd6ac","Type":"ContainerStarted","Data":"36691124b31819182561e68f5bd649d9b25afdc487b902595b7bab3020babdda"} Mar 09 13:24:48 crc kubenswrapper[4703]: I0309 13:24:48.819722 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" podStartSLOduration=3.819703895 podStartE2EDuration="3.819703895s" podCreationTimestamp="2026-03-09 13:24:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:48.816595103 +0000 UTC m=+284.784010789" watchObservedRunningTime="2026-03-09 13:24:48.819703895 +0000 UTC m=+284.787119581" Mar 09 13:24:48 crc kubenswrapper[4703]: I0309 13:24:48.841479 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zc87r" podStartSLOduration=4.132913945 podStartE2EDuration="1m0.841462136s" podCreationTimestamp="2026-03-09 13:23:48 +0000 UTC" firstStartedPulling="2026-03-09 13:23:50.854480889 +0000 UTC m=+226.821896565" lastFinishedPulling="2026-03-09 13:24:47.56302907 +0000 UTC m=+283.530444756" observedRunningTime="2026-03-09 13:24:48.837571922 +0000 UTC m=+284.804987598" 
watchObservedRunningTime="2026-03-09 13:24:48.841462136 +0000 UTC m=+284.808877822" Mar 09 13:24:49 crc kubenswrapper[4703]: I0309 13:24:48.857004 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m75f5" podStartSLOduration=3.522761438 podStartE2EDuration="57.856988074s" podCreationTimestamp="2026-03-09 13:23:51 +0000 UTC" firstStartedPulling="2026-03-09 13:23:53.075203695 +0000 UTC m=+229.042619381" lastFinishedPulling="2026-03-09 13:24:47.409430301 +0000 UTC m=+283.376846017" observedRunningTime="2026-03-09 13:24:48.854346316 +0000 UTC m=+284.821762002" watchObservedRunningTime="2026-03-09 13:24:48.856988074 +0000 UTC m=+284.824403760" Mar 09 13:24:49 crc kubenswrapper[4703]: I0309 13:24:48.874205 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6pqzw" podStartSLOduration=2.807452537 podStartE2EDuration="57.874189431s" podCreationTimestamp="2026-03-09 13:23:51 +0000 UTC" firstStartedPulling="2026-03-09 13:23:53.130469485 +0000 UTC m=+229.097885171" lastFinishedPulling="2026-03-09 13:24:48.197206379 +0000 UTC m=+284.164622065" observedRunningTime="2026-03-09 13:24:48.873607724 +0000 UTC m=+284.841023420" watchObservedRunningTime="2026-03-09 13:24:48.874189431 +0000 UTC m=+284.841605117" Mar 09 13:24:49 crc kubenswrapper[4703]: I0309 13:24:48.900772 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9gzfj" podStartSLOduration=4.284109262 podStartE2EDuration="1m0.900740524s" podCreationTimestamp="2026-03-09 13:23:48 +0000 UTC" firstStartedPulling="2026-03-09 13:23:50.848396528 +0000 UTC m=+226.815812214" lastFinishedPulling="2026-03-09 13:24:47.46502779 +0000 UTC m=+283.432443476" observedRunningTime="2026-03-09 13:24:48.897415616 +0000 UTC m=+284.864831302" watchObservedRunningTime="2026-03-09 13:24:48.900740524 +0000 UTC m=+284.868156210" Mar 09 13:24:49 crc 
kubenswrapper[4703]: I0309 13:24:48.945650 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" Mar 09 13:24:49 crc kubenswrapper[4703]: I0309 13:24:48.972877 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kvftn" podStartSLOduration=4.177558662 podStartE2EDuration="1m0.97283537s" podCreationTimestamp="2026-03-09 13:23:48 +0000 UTC" firstStartedPulling="2026-03-09 13:23:50.857519239 +0000 UTC m=+226.824934925" lastFinishedPulling="2026-03-09 13:24:47.652795947 +0000 UTC m=+283.620211633" observedRunningTime="2026-03-09 13:24:48.936967633 +0000 UTC m=+284.904383319" watchObservedRunningTime="2026-03-09 13:24:48.97283537 +0000 UTC m=+284.940251066" Mar 09 13:24:49 crc kubenswrapper[4703]: I0309 13:24:49.072335 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kvftn" Mar 09 13:24:49 crc kubenswrapper[4703]: I0309 13:24:49.072398 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kvftn" Mar 09 13:24:49 crc kubenswrapper[4703]: I0309 13:24:49.157046 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zc87r" Mar 09 13:24:49 crc kubenswrapper[4703]: I0309 13:24:49.157092 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zc87r" Mar 09 13:24:49 crc kubenswrapper[4703]: I0309 13:24:49.456454 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9gzfj" Mar 09 13:24:49 crc kubenswrapper[4703]: I0309 13:24:49.456797 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9gzfj" Mar 09 13:24:49 crc 
kubenswrapper[4703]: I0309 13:24:49.474445 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9tq2q" Mar 09 13:24:49 crc kubenswrapper[4703]: I0309 13:24:49.474490 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9tq2q" Mar 09 13:24:49 crc kubenswrapper[4703]: I0309 13:24:49.585455 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9tq2q" Mar 09 13:24:49 crc kubenswrapper[4703]: I0309 13:24:49.816626 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fd569c996-k954g" event={"ID":"28cd609c-fb43-4b67-9ffb-1a49482bd6ac","Type":"ContainerStarted","Data":"d02a2634b76f2240d28a3767ab5adfbba2b694a8f5b081bbca709e2f7a39064c"} Mar 09 13:24:49 crc kubenswrapper[4703]: I0309 13:24:49.818591 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-fd569c996-k954g" Mar 09 13:24:49 crc kubenswrapper[4703]: I0309 13:24:49.825812 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-fd569c996-k954g" Mar 09 13:24:49 crc kubenswrapper[4703]: I0309 13:24:49.836417 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-fd569c996-k954g" podStartSLOduration=4.836400574 podStartE2EDuration="4.836400574s" podCreationTimestamp="2026-03-09 13:24:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:24:49.83355882 +0000 UTC m=+285.800974506" watchObservedRunningTime="2026-03-09 13:24:49.836400574 +0000 UTC m=+285.803816260" Mar 09 13:24:49 crc kubenswrapper[4703]: I0309 13:24:49.879735 4703 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9tq2q" Mar 09 13:24:50 crc kubenswrapper[4703]: I0309 13:24:50.561349 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-zc87r" podUID="0ab69d10-042e-422a-830e-65d3d1132197" containerName="registry-server" probeResult="failure" output=< Mar 09 13:24:50 crc kubenswrapper[4703]: timeout: failed to connect service ":50051" within 1s Mar 09 13:24:50 crc kubenswrapper[4703]: > Mar 09 13:24:50 crc kubenswrapper[4703]: I0309 13:24:50.561654 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-9gzfj" podUID="7c0fe86d-3006-44b3-811a-f28758ef07fd" containerName="registry-server" probeResult="failure" output=< Mar 09 13:24:50 crc kubenswrapper[4703]: timeout: failed to connect service ":50051" within 1s Mar 09 13:24:50 crc kubenswrapper[4703]: > Mar 09 13:24:50 crc kubenswrapper[4703]: I0309 13:24:50.563326 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-kvftn" podUID="174b5189-1afe-40a3-813b-052dd29ad296" containerName="registry-server" probeResult="failure" output=< Mar 09 13:24:50 crc kubenswrapper[4703]: timeout: failed to connect service ":50051" within 1s Mar 09 13:24:50 crc kubenswrapper[4703]: > Mar 09 13:24:51 crc kubenswrapper[4703]: I0309 13:24:51.080778 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j5rmg" Mar 09 13:24:51 crc kubenswrapper[4703]: I0309 13:24:51.080875 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j5rmg" Mar 09 13:24:51 crc kubenswrapper[4703]: I0309 13:24:51.131678 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j5rmg" Mar 09 13:24:51 crc kubenswrapper[4703]: I0309 13:24:51.482001 4703 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6pqzw" Mar 09 13:24:51 crc kubenswrapper[4703]: I0309 13:24:51.482652 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6pqzw" Mar 09 13:24:51 crc kubenswrapper[4703]: I0309 13:24:51.524422 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6pqzw" Mar 09 13:24:51 crc kubenswrapper[4703]: I0309 13:24:51.876289 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j5rmg" Mar 09 13:24:52 crc kubenswrapper[4703]: I0309 13:24:52.123814 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m75f5" Mar 09 13:24:52 crc kubenswrapper[4703]: I0309 13:24:52.123872 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m75f5" Mar 09 13:24:52 crc kubenswrapper[4703]: I0309 13:24:52.498640 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gkg4j" Mar 09 13:24:52 crc kubenswrapper[4703]: I0309 13:24:52.498692 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gkg4j" Mar 09 13:24:52 crc kubenswrapper[4703]: I0309 13:24:52.942651 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9tq2q"] Mar 09 13:24:52 crc kubenswrapper[4703]: I0309 13:24:52.942946 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9tq2q" podUID="adee29a3-c626-4dc3-8323-ce1e852faea3" containerName="registry-server" containerID="cri-o://6429dd51afe70aa90258b9aae53ce56e1a8b3e9c6bb03437e602a91dd69bf2d8" gracePeriod=2 Mar 09 13:24:53 crc 
kubenswrapper[4703]: I0309 13:24:53.163724 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m75f5" podUID="75d9c9d6-9701-4627-a87b-b2ae669a0eae" containerName="registry-server" probeResult="failure" output=< Mar 09 13:24:53 crc kubenswrapper[4703]: timeout: failed to connect service ":50051" within 1s Mar 09 13:24:53 crc kubenswrapper[4703]: > Mar 09 13:24:53 crc kubenswrapper[4703]: I0309 13:24:53.543441 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gkg4j" podUID="09a58d8e-1a70-4e6c-b217-1c2b2537accc" containerName="registry-server" probeResult="failure" output=< Mar 09 13:24:53 crc kubenswrapper[4703]: timeout: failed to connect service ":50051" within 1s Mar 09 13:24:53 crc kubenswrapper[4703]: > Mar 09 13:24:53 crc kubenswrapper[4703]: I0309 13:24:53.841521 4703 generic.go:334] "Generic (PLEG): container finished" podID="adee29a3-c626-4dc3-8323-ce1e852faea3" containerID="6429dd51afe70aa90258b9aae53ce56e1a8b3e9c6bb03437e602a91dd69bf2d8" exitCode=0 Mar 09 13:24:53 crc kubenswrapper[4703]: I0309 13:24:53.841565 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tq2q" event={"ID":"adee29a3-c626-4dc3-8323-ce1e852faea3","Type":"ContainerDied","Data":"6429dd51afe70aa90258b9aae53ce56e1a8b3e9c6bb03437e602a91dd69bf2d8"} Mar 09 13:24:54 crc kubenswrapper[4703]: I0309 13:24:54.165145 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9tq2q" Mar 09 13:24:54 crc kubenswrapper[4703]: I0309 13:24:54.267294 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gttc\" (UniqueName: \"kubernetes.io/projected/adee29a3-c626-4dc3-8323-ce1e852faea3-kube-api-access-6gttc\") pod \"adee29a3-c626-4dc3-8323-ce1e852faea3\" (UID: \"adee29a3-c626-4dc3-8323-ce1e852faea3\") " Mar 09 13:24:54 crc kubenswrapper[4703]: I0309 13:24:54.267440 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adee29a3-c626-4dc3-8323-ce1e852faea3-utilities\") pod \"adee29a3-c626-4dc3-8323-ce1e852faea3\" (UID: \"adee29a3-c626-4dc3-8323-ce1e852faea3\") " Mar 09 13:24:54 crc kubenswrapper[4703]: I0309 13:24:54.267480 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adee29a3-c626-4dc3-8323-ce1e852faea3-catalog-content\") pod \"adee29a3-c626-4dc3-8323-ce1e852faea3\" (UID: \"adee29a3-c626-4dc3-8323-ce1e852faea3\") " Mar 09 13:24:54 crc kubenswrapper[4703]: I0309 13:24:54.276244 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adee29a3-c626-4dc3-8323-ce1e852faea3-kube-api-access-6gttc" (OuterVolumeSpecName: "kube-api-access-6gttc") pod "adee29a3-c626-4dc3-8323-ce1e852faea3" (UID: "adee29a3-c626-4dc3-8323-ce1e852faea3"). InnerVolumeSpecName "kube-api-access-6gttc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:24:54 crc kubenswrapper[4703]: I0309 13:24:54.276592 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adee29a3-c626-4dc3-8323-ce1e852faea3-utilities" (OuterVolumeSpecName: "utilities") pod "adee29a3-c626-4dc3-8323-ce1e852faea3" (UID: "adee29a3-c626-4dc3-8323-ce1e852faea3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:24:54 crc kubenswrapper[4703]: I0309 13:24:54.343921 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adee29a3-c626-4dc3-8323-ce1e852faea3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "adee29a3-c626-4dc3-8323-ce1e852faea3" (UID: "adee29a3-c626-4dc3-8323-ce1e852faea3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:24:54 crc kubenswrapper[4703]: I0309 13:24:54.368518 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gttc\" (UniqueName: \"kubernetes.io/projected/adee29a3-c626-4dc3-8323-ce1e852faea3-kube-api-access-6gttc\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:54 crc kubenswrapper[4703]: I0309 13:24:54.368551 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adee29a3-c626-4dc3-8323-ce1e852faea3-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:54 crc kubenswrapper[4703]: I0309 13:24:54.368562 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adee29a3-c626-4dc3-8323-ce1e852faea3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:24:54 crc kubenswrapper[4703]: I0309 13:24:54.851506 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tq2q" event={"ID":"adee29a3-c626-4dc3-8323-ce1e852faea3","Type":"ContainerDied","Data":"a79b07add877772bfcd1f6fbc83d5ce903f6d5a0078394174bbfbd686558f956"} Mar 09 13:24:54 crc kubenswrapper[4703]: I0309 13:24:54.851829 4703 scope.go:117] "RemoveContainer" containerID="6429dd51afe70aa90258b9aae53ce56e1a8b3e9c6bb03437e602a91dd69bf2d8" Mar 09 13:24:54 crc kubenswrapper[4703]: I0309 13:24:54.851750 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9tq2q" Mar 09 13:24:54 crc kubenswrapper[4703]: I0309 13:24:54.870280 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9tq2q"] Mar 09 13:24:54 crc kubenswrapper[4703]: I0309 13:24:54.875174 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9tq2q"] Mar 09 13:24:54 crc kubenswrapper[4703]: I0309 13:24:54.881742 4703 scope.go:117] "RemoveContainer" containerID="f935878b863d664b48d1afa38d3f3997820f8c387d0ace58938614e1ac4614f4" Mar 09 13:24:54 crc kubenswrapper[4703]: I0309 13:24:54.899763 4703 scope.go:117] "RemoveContainer" containerID="41e1c1cf8f3dfd6382fe1d8bdb300394e54950f0864775a17b5cccef6a5fbd85" Mar 09 13:24:56 crc kubenswrapper[4703]: I0309 13:24:56.713343 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adee29a3-c626-4dc3-8323-ce1e852faea3" path="/var/lib/kubelet/pods/adee29a3-c626-4dc3-8323-ce1e852faea3/volumes" Mar 09 13:24:59 crc kubenswrapper[4703]: I0309 13:24:59.123884 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kvftn" Mar 09 13:24:59 crc kubenswrapper[4703]: I0309 13:24:59.184023 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kvftn" Mar 09 13:24:59 crc kubenswrapper[4703]: I0309 13:24:59.252155 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zc87r" Mar 09 13:24:59 crc kubenswrapper[4703]: I0309 13:24:59.292420 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zc87r" Mar 09 13:24:59 crc kubenswrapper[4703]: I0309 13:24:59.512040 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9gzfj" Mar 09 
13:24:59 crc kubenswrapper[4703]: I0309 13:24:59.569349 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9gzfj" Mar 09 13:25:00 crc kubenswrapper[4703]: I0309 13:25:00.563325 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9gzfj"] Mar 09 13:25:00 crc kubenswrapper[4703]: I0309 13:25:00.885808 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9gzfj" podUID="7c0fe86d-3006-44b3-811a-f28758ef07fd" containerName="registry-server" containerID="cri-o://f27338878355dd383f36c72cdd0f9d228c67b9784548da4055b6646e2cef10b9" gracePeriod=2 Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.350404 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9gzfj" Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.379319 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ghbk9"] Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.462779 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwqvr\" (UniqueName: \"kubernetes.io/projected/7c0fe86d-3006-44b3-811a-f28758ef07fd-kube-api-access-mwqvr\") pod \"7c0fe86d-3006-44b3-811a-f28758ef07fd\" (UID: \"7c0fe86d-3006-44b3-811a-f28758ef07fd\") " Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.462982 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c0fe86d-3006-44b3-811a-f28758ef07fd-catalog-content\") pod \"7c0fe86d-3006-44b3-811a-f28758ef07fd\" (UID: \"7c0fe86d-3006-44b3-811a-f28758ef07fd\") " Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.463027 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/7c0fe86d-3006-44b3-811a-f28758ef07fd-utilities\") pod \"7c0fe86d-3006-44b3-811a-f28758ef07fd\" (UID: \"7c0fe86d-3006-44b3-811a-f28758ef07fd\") " Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.465447 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c0fe86d-3006-44b3-811a-f28758ef07fd-utilities" (OuterVolumeSpecName: "utilities") pod "7c0fe86d-3006-44b3-811a-f28758ef07fd" (UID: "7c0fe86d-3006-44b3-811a-f28758ef07fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.472131 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c0fe86d-3006-44b3-811a-f28758ef07fd-kube-api-access-mwqvr" (OuterVolumeSpecName: "kube-api-access-mwqvr") pod "7c0fe86d-3006-44b3-811a-f28758ef07fd" (UID: "7c0fe86d-3006-44b3-811a-f28758ef07fd"). InnerVolumeSpecName "kube-api-access-mwqvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.537515 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6pqzw" Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.558378 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c0fe86d-3006-44b3-811a-f28758ef07fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c0fe86d-3006-44b3-811a-f28758ef07fd" (UID: "7c0fe86d-3006-44b3-811a-f28758ef07fd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.564531 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c0fe86d-3006-44b3-811a-f28758ef07fd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.564567 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c0fe86d-3006-44b3-811a-f28758ef07fd-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.564577 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwqvr\" (UniqueName: \"kubernetes.io/projected/7c0fe86d-3006-44b3-811a-f28758ef07fd-kube-api-access-mwqvr\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.892279 4703 generic.go:334] "Generic (PLEG): container finished" podID="7c0fe86d-3006-44b3-811a-f28758ef07fd" containerID="f27338878355dd383f36c72cdd0f9d228c67b9784548da4055b6646e2cef10b9" exitCode=0 Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.892317 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gzfj" event={"ID":"7c0fe86d-3006-44b3-811a-f28758ef07fd","Type":"ContainerDied","Data":"f27338878355dd383f36c72cdd0f9d228c67b9784548da4055b6646e2cef10b9"} Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.892341 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gzfj" event={"ID":"7c0fe86d-3006-44b3-811a-f28758ef07fd","Type":"ContainerDied","Data":"ae456d88e5213923bd809ea0cefaa26759897e2a30818214d10cca2732e31ce8"} Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.892357 4703 scope.go:117] "RemoveContainer" containerID="f27338878355dd383f36c72cdd0f9d228c67b9784548da4055b6646e2cef10b9" Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 
13:25:01.892451 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9gzfj" Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.913790 4703 scope.go:117] "RemoveContainer" containerID="2f62f352c6264b3dcfb8d64697b49cbd742af74867f6830391d8064111810af6" Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.924116 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9gzfj"] Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.931208 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9gzfj"] Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.942758 4703 scope.go:117] "RemoveContainer" containerID="0e0e3bde08befa95e70e60806abba8fb4b215fda341789e25b4085443a19b938" Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.961819 4703 scope.go:117] "RemoveContainer" containerID="f27338878355dd383f36c72cdd0f9d228c67b9784548da4055b6646e2cef10b9" Mar 09 13:25:01 crc kubenswrapper[4703]: E0309 13:25:01.962616 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f27338878355dd383f36c72cdd0f9d228c67b9784548da4055b6646e2cef10b9\": container with ID starting with f27338878355dd383f36c72cdd0f9d228c67b9784548da4055b6646e2cef10b9 not found: ID does not exist" containerID="f27338878355dd383f36c72cdd0f9d228c67b9784548da4055b6646e2cef10b9" Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.962683 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f27338878355dd383f36c72cdd0f9d228c67b9784548da4055b6646e2cef10b9"} err="failed to get container status \"f27338878355dd383f36c72cdd0f9d228c67b9784548da4055b6646e2cef10b9\": rpc error: code = NotFound desc = could not find container \"f27338878355dd383f36c72cdd0f9d228c67b9784548da4055b6646e2cef10b9\": container with ID starting with 
f27338878355dd383f36c72cdd0f9d228c67b9784548da4055b6646e2cef10b9 not found: ID does not exist" Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.962715 4703 scope.go:117] "RemoveContainer" containerID="2f62f352c6264b3dcfb8d64697b49cbd742af74867f6830391d8064111810af6" Mar 09 13:25:01 crc kubenswrapper[4703]: E0309 13:25:01.963048 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f62f352c6264b3dcfb8d64697b49cbd742af74867f6830391d8064111810af6\": container with ID starting with 2f62f352c6264b3dcfb8d64697b49cbd742af74867f6830391d8064111810af6 not found: ID does not exist" containerID="2f62f352c6264b3dcfb8d64697b49cbd742af74867f6830391d8064111810af6" Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.963071 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f62f352c6264b3dcfb8d64697b49cbd742af74867f6830391d8064111810af6"} err="failed to get container status \"2f62f352c6264b3dcfb8d64697b49cbd742af74867f6830391d8064111810af6\": rpc error: code = NotFound desc = could not find container \"2f62f352c6264b3dcfb8d64697b49cbd742af74867f6830391d8064111810af6\": container with ID starting with 2f62f352c6264b3dcfb8d64697b49cbd742af74867f6830391d8064111810af6 not found: ID does not exist" Mar 09 13:25:01 crc kubenswrapper[4703]: I0309 13:25:01.963087 4703 scope.go:117] "RemoveContainer" containerID="0e0e3bde08befa95e70e60806abba8fb4b215fda341789e25b4085443a19b938" Mar 09 13:25:01 crc kubenswrapper[4703]: E0309 13:25:01.963277 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e0e3bde08befa95e70e60806abba8fb4b215fda341789e25b4085443a19b938\": container with ID starting with 0e0e3bde08befa95e70e60806abba8fb4b215fda341789e25b4085443a19b938 not found: ID does not exist" containerID="0e0e3bde08befa95e70e60806abba8fb4b215fda341789e25b4085443a19b938" Mar 09 13:25:01 crc 
kubenswrapper[4703]: I0309 13:25:01.963296 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e0e3bde08befa95e70e60806abba8fb4b215fda341789e25b4085443a19b938"} err="failed to get container status \"0e0e3bde08befa95e70e60806abba8fb4b215fda341789e25b4085443a19b938\": rpc error: code = NotFound desc = could not find container \"0e0e3bde08befa95e70e60806abba8fb4b215fda341789e25b4085443a19b938\": container with ID starting with 0e0e3bde08befa95e70e60806abba8fb4b215fda341789e25b4085443a19b938 not found: ID does not exist" Mar 09 13:25:02 crc kubenswrapper[4703]: I0309 13:25:02.160943 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m75f5" Mar 09 13:25:02 crc kubenswrapper[4703]: I0309 13:25:02.203253 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m75f5" Mar 09 13:25:02 crc kubenswrapper[4703]: I0309 13:25:02.554695 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gkg4j" Mar 09 13:25:02 crc kubenswrapper[4703]: I0309 13:25:02.631761 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gkg4j" Mar 09 13:25:02 crc kubenswrapper[4703]: I0309 13:25:02.713916 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c0fe86d-3006-44b3-811a-f28758ef07fd" path="/var/lib/kubelet/pods/7c0fe86d-3006-44b3-811a-f28758ef07fd/volumes" Mar 09 13:25:02 crc kubenswrapper[4703]: I0309 13:25:02.970823 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pqzw"] Mar 09 13:25:02 crc kubenswrapper[4703]: I0309 13:25:02.971235 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6pqzw" podUID="3916d955-3b83-488d-959c-88b09424a3e3" 
containerName="registry-server" containerID="cri-o://cdbe06f5494203db8660cb9dcb0ddad1b78893036fe12f3c667bb6d5cb731c67" gracePeriod=2 Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.458921 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pqzw" Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.591045 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3916d955-3b83-488d-959c-88b09424a3e3-utilities\") pod \"3916d955-3b83-488d-959c-88b09424a3e3\" (UID: \"3916d955-3b83-488d-959c-88b09424a3e3\") " Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.591117 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3916d955-3b83-488d-959c-88b09424a3e3-catalog-content\") pod \"3916d955-3b83-488d-959c-88b09424a3e3\" (UID: \"3916d955-3b83-488d-959c-88b09424a3e3\") " Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.591210 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg75f\" (UniqueName: \"kubernetes.io/projected/3916d955-3b83-488d-959c-88b09424a3e3-kube-api-access-fg75f\") pod \"3916d955-3b83-488d-959c-88b09424a3e3\" (UID: \"3916d955-3b83-488d-959c-88b09424a3e3\") " Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.592658 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3916d955-3b83-488d-959c-88b09424a3e3-utilities" (OuterVolumeSpecName: "utilities") pod "3916d955-3b83-488d-959c-88b09424a3e3" (UID: "3916d955-3b83-488d-959c-88b09424a3e3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.596039 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3916d955-3b83-488d-959c-88b09424a3e3-kube-api-access-fg75f" (OuterVolumeSpecName: "kube-api-access-fg75f") pod "3916d955-3b83-488d-959c-88b09424a3e3" (UID: "3916d955-3b83-488d-959c-88b09424a3e3"). InnerVolumeSpecName "kube-api-access-fg75f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.618513 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3916d955-3b83-488d-959c-88b09424a3e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3916d955-3b83-488d-959c-88b09424a3e3" (UID: "3916d955-3b83-488d-959c-88b09424a3e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.693182 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg75f\" (UniqueName: \"kubernetes.io/projected/3916d955-3b83-488d-959c-88b09424a3e3-kube-api-access-fg75f\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.693499 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3916d955-3b83-488d-959c-88b09424a3e3-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.693513 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3916d955-3b83-488d-959c-88b09424a3e3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.906919 4703 generic.go:334] "Generic (PLEG): container finished" podID="3916d955-3b83-488d-959c-88b09424a3e3" 
containerID="cdbe06f5494203db8660cb9dcb0ddad1b78893036fe12f3c667bb6d5cb731c67" exitCode=0 Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.906969 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pqzw" event={"ID":"3916d955-3b83-488d-959c-88b09424a3e3","Type":"ContainerDied","Data":"cdbe06f5494203db8660cb9dcb0ddad1b78893036fe12f3c667bb6d5cb731c67"} Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.906993 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pqzw" Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.907019 4703 scope.go:117] "RemoveContainer" containerID="cdbe06f5494203db8660cb9dcb0ddad1b78893036fe12f3c667bb6d5cb731c67" Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.907005 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pqzw" event={"ID":"3916d955-3b83-488d-959c-88b09424a3e3","Type":"ContainerDied","Data":"2d20e1063bb57c11f6763c69cc818cdffc5be220dd8425835a0ae66ea41c0f5b"} Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.921464 4703 scope.go:117] "RemoveContainer" containerID="8b12cfc49694669222dda6e4b22c8d0eff402ac2f8dd5ecaca745065b6e11184" Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.942294 4703 scope.go:117] "RemoveContainer" containerID="5d262eea4605e3a13ece8ebc308caa17faf7cbb415905a413665c7e51a50b38b" Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.948301 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pqzw"] Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.955102 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pqzw"] Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.965227 4703 scope.go:117] "RemoveContainer" containerID="cdbe06f5494203db8660cb9dcb0ddad1b78893036fe12f3c667bb6d5cb731c67" Mar 09 
13:25:03 crc kubenswrapper[4703]: E0309 13:25:03.965712 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdbe06f5494203db8660cb9dcb0ddad1b78893036fe12f3c667bb6d5cb731c67\": container with ID starting with cdbe06f5494203db8660cb9dcb0ddad1b78893036fe12f3c667bb6d5cb731c67 not found: ID does not exist" containerID="cdbe06f5494203db8660cb9dcb0ddad1b78893036fe12f3c667bb6d5cb731c67" Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.965765 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdbe06f5494203db8660cb9dcb0ddad1b78893036fe12f3c667bb6d5cb731c67"} err="failed to get container status \"cdbe06f5494203db8660cb9dcb0ddad1b78893036fe12f3c667bb6d5cb731c67\": rpc error: code = NotFound desc = could not find container \"cdbe06f5494203db8660cb9dcb0ddad1b78893036fe12f3c667bb6d5cb731c67\": container with ID starting with cdbe06f5494203db8660cb9dcb0ddad1b78893036fe12f3c667bb6d5cb731c67 not found: ID does not exist" Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.965791 4703 scope.go:117] "RemoveContainer" containerID="8b12cfc49694669222dda6e4b22c8d0eff402ac2f8dd5ecaca745065b6e11184" Mar 09 13:25:03 crc kubenswrapper[4703]: E0309 13:25:03.966480 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b12cfc49694669222dda6e4b22c8d0eff402ac2f8dd5ecaca745065b6e11184\": container with ID starting with 8b12cfc49694669222dda6e4b22c8d0eff402ac2f8dd5ecaca745065b6e11184 not found: ID does not exist" containerID="8b12cfc49694669222dda6e4b22c8d0eff402ac2f8dd5ecaca745065b6e11184" Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.966506 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b12cfc49694669222dda6e4b22c8d0eff402ac2f8dd5ecaca745065b6e11184"} err="failed to get container status 
\"8b12cfc49694669222dda6e4b22c8d0eff402ac2f8dd5ecaca745065b6e11184\": rpc error: code = NotFound desc = could not find container \"8b12cfc49694669222dda6e4b22c8d0eff402ac2f8dd5ecaca745065b6e11184\": container with ID starting with 8b12cfc49694669222dda6e4b22c8d0eff402ac2f8dd5ecaca745065b6e11184 not found: ID does not exist" Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.966528 4703 scope.go:117] "RemoveContainer" containerID="5d262eea4605e3a13ece8ebc308caa17faf7cbb415905a413665c7e51a50b38b" Mar 09 13:25:03 crc kubenswrapper[4703]: E0309 13:25:03.966764 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d262eea4605e3a13ece8ebc308caa17faf7cbb415905a413665c7e51a50b38b\": container with ID starting with 5d262eea4605e3a13ece8ebc308caa17faf7cbb415905a413665c7e51a50b38b not found: ID does not exist" containerID="5d262eea4605e3a13ece8ebc308caa17faf7cbb415905a413665c7e51a50b38b" Mar 09 13:25:03 crc kubenswrapper[4703]: I0309 13:25:03.966795 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d262eea4605e3a13ece8ebc308caa17faf7cbb415905a413665c7e51a50b38b"} err="failed to get container status \"5d262eea4605e3a13ece8ebc308caa17faf7cbb415905a413665c7e51a50b38b\": rpc error: code = NotFound desc = could not find container \"5d262eea4605e3a13ece8ebc308caa17faf7cbb415905a413665c7e51a50b38b\": container with ID starting with 5d262eea4605e3a13ece8ebc308caa17faf7cbb415905a413665c7e51a50b38b not found: ID does not exist" Mar 09 13:25:04 crc kubenswrapper[4703]: I0309 13:25:04.719909 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3916d955-3b83-488d-959c-88b09424a3e3" path="/var/lib/kubelet/pods/3916d955-3b83-488d-959c-88b09424a3e3/volumes" Mar 09 13:25:05 crc kubenswrapper[4703]: I0309 13:25:05.703683 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-fd569c996-k954g"] Mar 09 13:25:05 crc kubenswrapper[4703]: I0309 13:25:05.704191 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-fd569c996-k954g" podUID="28cd609c-fb43-4b67-9ffb-1a49482bd6ac" containerName="controller-manager" containerID="cri-o://d02a2634b76f2240d28a3767ab5adfbba2b694a8f5b081bbca709e2f7a39064c" gracePeriod=30 Mar 09 13:25:05 crc kubenswrapper[4703]: I0309 13:25:05.787911 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss"] Mar 09 13:25:05 crc kubenswrapper[4703]: I0309 13:25:05.788164 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" podUID="58cdc96b-614c-4784-b144-dd817f60399c" containerName="route-controller-manager" containerID="cri-o://4ea3f87802b135c036106271a704fb252ba227ca1b4ea25c024a71519d486e84" gracePeriod=30 Mar 09 13:25:05 crc kubenswrapper[4703]: I0309 13:25:05.932008 4703 generic.go:334] "Generic (PLEG): container finished" podID="28cd609c-fb43-4b67-9ffb-1a49482bd6ac" containerID="d02a2634b76f2240d28a3767ab5adfbba2b694a8f5b081bbca709e2f7a39064c" exitCode=0 Mar 09 13:25:05 crc kubenswrapper[4703]: I0309 13:25:05.932079 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fd569c996-k954g" event={"ID":"28cd609c-fb43-4b67-9ffb-1a49482bd6ac","Type":"ContainerDied","Data":"d02a2634b76f2240d28a3767ab5adfbba2b694a8f5b081bbca709e2f7a39064c"} Mar 09 13:25:05 crc kubenswrapper[4703]: I0309 13:25:05.934377 4703 generic.go:334] "Generic (PLEG): container finished" podID="58cdc96b-614c-4784-b144-dd817f60399c" containerID="4ea3f87802b135c036106271a704fb252ba227ca1b4ea25c024a71519d486e84" exitCode=0 Mar 09 13:25:05 crc kubenswrapper[4703]: I0309 13:25:05.934398 4703 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" event={"ID":"58cdc96b-614c-4784-b144-dd817f60399c","Type":"ContainerDied","Data":"4ea3f87802b135c036106271a704fb252ba227ca1b4ea25c024a71519d486e84"} Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.209135 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fd569c996-k954g" Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.235942 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.330268 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-config\") pod \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\" (UID: \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\") " Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.330314 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm4vh\" (UniqueName: \"kubernetes.io/projected/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-kube-api-access-sm4vh\") pod \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\" (UID: \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\") " Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.330341 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58cdc96b-614c-4784-b144-dd817f60399c-config\") pod \"58cdc96b-614c-4784-b144-dd817f60399c\" (UID: \"58cdc96b-614c-4784-b144-dd817f60399c\") " Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.330359 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4r5d\" (UniqueName: 
\"kubernetes.io/projected/58cdc96b-614c-4784-b144-dd817f60399c-kube-api-access-t4r5d\") pod \"58cdc96b-614c-4784-b144-dd817f60399c\" (UID: \"58cdc96b-614c-4784-b144-dd817f60399c\") " Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.330382 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-client-ca\") pod \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\" (UID: \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\") " Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.330399 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58cdc96b-614c-4784-b144-dd817f60399c-client-ca\") pod \"58cdc96b-614c-4784-b144-dd817f60399c\" (UID: \"58cdc96b-614c-4784-b144-dd817f60399c\") " Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.330412 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-proxy-ca-bundles\") pod \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\" (UID: \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\") " Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.330428 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58cdc96b-614c-4784-b144-dd817f60399c-serving-cert\") pod \"58cdc96b-614c-4784-b144-dd817f60399c\" (UID: \"58cdc96b-614c-4784-b144-dd817f60399c\") " Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.330443 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-serving-cert\") pod \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\" (UID: \"28cd609c-fb43-4b67-9ffb-1a49482bd6ac\") " Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.331625 4703 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "28cd609c-fb43-4b67-9ffb-1a49482bd6ac" (UID: "28cd609c-fb43-4b67-9ffb-1a49482bd6ac"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.331740 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-client-ca" (OuterVolumeSpecName: "client-ca") pod "28cd609c-fb43-4b67-9ffb-1a49482bd6ac" (UID: "28cd609c-fb43-4b67-9ffb-1a49482bd6ac"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.331765 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58cdc96b-614c-4784-b144-dd817f60399c-client-ca" (OuterVolumeSpecName: "client-ca") pod "58cdc96b-614c-4784-b144-dd817f60399c" (UID: "58cdc96b-614c-4784-b144-dd817f60399c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.331853 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58cdc96b-614c-4784-b144-dd817f60399c-config" (OuterVolumeSpecName: "config") pod "58cdc96b-614c-4784-b144-dd817f60399c" (UID: "58cdc96b-614c-4784-b144-dd817f60399c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.332192 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-config" (OuterVolumeSpecName: "config") pod "28cd609c-fb43-4b67-9ffb-1a49482bd6ac" (UID: "28cd609c-fb43-4b67-9ffb-1a49482bd6ac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.335751 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "28cd609c-fb43-4b67-9ffb-1a49482bd6ac" (UID: "28cd609c-fb43-4b67-9ffb-1a49482bd6ac"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.335831 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58cdc96b-614c-4784-b144-dd817f60399c-kube-api-access-t4r5d" (OuterVolumeSpecName: "kube-api-access-t4r5d") pod "58cdc96b-614c-4784-b144-dd817f60399c" (UID: "58cdc96b-614c-4784-b144-dd817f60399c"). InnerVolumeSpecName "kube-api-access-t4r5d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.336071 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58cdc96b-614c-4784-b144-dd817f60399c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "58cdc96b-614c-4784-b144-dd817f60399c" (UID: "58cdc96b-614c-4784-b144-dd817f60399c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.336967 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-kube-api-access-sm4vh" (OuterVolumeSpecName: "kube-api-access-sm4vh") pod "28cd609c-fb43-4b67-9ffb-1a49482bd6ac" (UID: "28cd609c-fb43-4b67-9ffb-1a49482bd6ac"). InnerVolumeSpecName "kube-api-access-sm4vh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.363428 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gkg4j"]
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.363669 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gkg4j" podUID="09a58d8e-1a70-4e6c-b217-1c2b2537accc" containerName="registry-server" containerID="cri-o://265ebf0960400e6220159eb48249182b4956971c37888f1faefa7c863729700b" gracePeriod=2
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.431240 4703 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.431279 4703 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58cdc96b-614c-4784-b144-dd817f60399c-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.431291 4703 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.431303 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58cdc96b-614c-4784-b144-dd817f60399c-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.431313 4703 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.431324 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.431335 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm4vh\" (UniqueName: \"kubernetes.io/projected/28cd609c-fb43-4b67-9ffb-1a49482bd6ac-kube-api-access-sm4vh\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.431346 4703 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58cdc96b-614c-4784-b144-dd817f60399c-config\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.431357 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4r5d\" (UniqueName: \"kubernetes.io/projected/58cdc96b-614c-4784-b144-dd817f60399c-kube-api-access-t4r5d\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.739282 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gkg4j"
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.936673 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glzwf\" (UniqueName: \"kubernetes.io/projected/09a58d8e-1a70-4e6c-b217-1c2b2537accc-kube-api-access-glzwf\") pod \"09a58d8e-1a70-4e6c-b217-1c2b2537accc\" (UID: \"09a58d8e-1a70-4e6c-b217-1c2b2537accc\") "
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.936735 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09a58d8e-1a70-4e6c-b217-1c2b2537accc-utilities\") pod \"09a58d8e-1a70-4e6c-b217-1c2b2537accc\" (UID: \"09a58d8e-1a70-4e6c-b217-1c2b2537accc\") "
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.936790 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09a58d8e-1a70-4e6c-b217-1c2b2537accc-catalog-content\") pod \"09a58d8e-1a70-4e6c-b217-1c2b2537accc\" (UID: \"09a58d8e-1a70-4e6c-b217-1c2b2537accc\") "
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.937892 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09a58d8e-1a70-4e6c-b217-1c2b2537accc-utilities" (OuterVolumeSpecName: "utilities") pod "09a58d8e-1a70-4e6c-b217-1c2b2537accc" (UID: "09a58d8e-1a70-4e6c-b217-1c2b2537accc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.941780 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09a58d8e-1a70-4e6c-b217-1c2b2537accc-kube-api-access-glzwf" (OuterVolumeSpecName: "kube-api-access-glzwf") pod "09a58d8e-1a70-4e6c-b217-1c2b2537accc" (UID: "09a58d8e-1a70-4e6c-b217-1c2b2537accc"). InnerVolumeSpecName "kube-api-access-glzwf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.943215 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fd569c996-k954g" event={"ID":"28cd609c-fb43-4b67-9ffb-1a49482bd6ac","Type":"ContainerDied","Data":"36691124b31819182561e68f5bd649d9b25afdc487b902595b7bab3020babdda"}
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.943314 4703 scope.go:117] "RemoveContainer" containerID="d02a2634b76f2240d28a3767ab5adfbba2b694a8f5b081bbca709e2f7a39064c"
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.943242 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fd569c996-k954g"
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.945941 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss" event={"ID":"58cdc96b-614c-4784-b144-dd817f60399c","Type":"ContainerDied","Data":"0163b868729b663933a06dcc3de16d0cc18a7f57793042d932baecd53b978704"}
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.946062 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss"
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.951136 4703 generic.go:334] "Generic (PLEG): container finished" podID="09a58d8e-1a70-4e6c-b217-1c2b2537accc" containerID="265ebf0960400e6220159eb48249182b4956971c37888f1faefa7c863729700b" exitCode=0
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.951171 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gkg4j" event={"ID":"09a58d8e-1a70-4e6c-b217-1c2b2537accc","Type":"ContainerDied","Data":"265ebf0960400e6220159eb48249182b4956971c37888f1faefa7c863729700b"}
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.951195 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gkg4j" event={"ID":"09a58d8e-1a70-4e6c-b217-1c2b2537accc","Type":"ContainerDied","Data":"5966fc7db128b2e2c9e4ecbd8a1a85d3006725ee4dcc0eaea435151da504fdba"}
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.951270 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gkg4j"
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.991139 4703 scope.go:117] "RemoveContainer" containerID="4ea3f87802b135c036106271a704fb252ba227ca1b4ea25c024a71519d486e84"
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.994160 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-fd569c996-k954g"]
Mar 09 13:25:06 crc kubenswrapper[4703]: I0309 13:25:06.999270 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-fd569c996-k954g"]
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.001812 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss"]
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.004542 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76d69bdc5c-rmzss"]
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.011264 4703 scope.go:117] "RemoveContainer" containerID="265ebf0960400e6220159eb48249182b4956971c37888f1faefa7c863729700b"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.028511 4703 scope.go:117] "RemoveContainer" containerID="ec53da7bfbecd5a1a86e9e8b080ef549b6d4fab2bf4d31e6a43f145a261b27cc"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.037535 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glzwf\" (UniqueName: \"kubernetes.io/projected/09a58d8e-1a70-4e6c-b217-1c2b2537accc-kube-api-access-glzwf\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.037563 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09a58d8e-1a70-4e6c-b217-1c2b2537accc-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.046764 4703 scope.go:117] "RemoveContainer" containerID="551a6f3e5645ea0713ab184bb1af25788508bbafec11147df236762e6bfe5fed"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.063521 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09a58d8e-1a70-4e6c-b217-1c2b2537accc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09a58d8e-1a70-4e6c-b217-1c2b2537accc" (UID: "09a58d8e-1a70-4e6c-b217-1c2b2537accc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.070127 4703 scope.go:117] "RemoveContainer" containerID="265ebf0960400e6220159eb48249182b4956971c37888f1faefa7c863729700b"
Mar 09 13:25:07 crc kubenswrapper[4703]: E0309 13:25:07.070707 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"265ebf0960400e6220159eb48249182b4956971c37888f1faefa7c863729700b\": container with ID starting with 265ebf0960400e6220159eb48249182b4956971c37888f1faefa7c863729700b not found: ID does not exist" containerID="265ebf0960400e6220159eb48249182b4956971c37888f1faefa7c863729700b"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.070756 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"265ebf0960400e6220159eb48249182b4956971c37888f1faefa7c863729700b"} err="failed to get container status \"265ebf0960400e6220159eb48249182b4956971c37888f1faefa7c863729700b\": rpc error: code = NotFound desc = could not find container \"265ebf0960400e6220159eb48249182b4956971c37888f1faefa7c863729700b\": container with ID starting with 265ebf0960400e6220159eb48249182b4956971c37888f1faefa7c863729700b not found: ID does not exist"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.070777 4703 scope.go:117] "RemoveContainer" containerID="ec53da7bfbecd5a1a86e9e8b080ef549b6d4fab2bf4d31e6a43f145a261b27cc"
Mar 09 13:25:07 crc kubenswrapper[4703]: E0309 13:25:07.071130 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec53da7bfbecd5a1a86e9e8b080ef549b6d4fab2bf4d31e6a43f145a261b27cc\": container with ID starting with ec53da7bfbecd5a1a86e9e8b080ef549b6d4fab2bf4d31e6a43f145a261b27cc not found: ID does not exist" containerID="ec53da7bfbecd5a1a86e9e8b080ef549b6d4fab2bf4d31e6a43f145a261b27cc"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.071166 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec53da7bfbecd5a1a86e9e8b080ef549b6d4fab2bf4d31e6a43f145a261b27cc"} err="failed to get container status \"ec53da7bfbecd5a1a86e9e8b080ef549b6d4fab2bf4d31e6a43f145a261b27cc\": rpc error: code = NotFound desc = could not find container \"ec53da7bfbecd5a1a86e9e8b080ef549b6d4fab2bf4d31e6a43f145a261b27cc\": container with ID starting with ec53da7bfbecd5a1a86e9e8b080ef549b6d4fab2bf4d31e6a43f145a261b27cc not found: ID does not exist"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.071193 4703 scope.go:117] "RemoveContainer" containerID="551a6f3e5645ea0713ab184bb1af25788508bbafec11147df236762e6bfe5fed"
Mar 09 13:25:07 crc kubenswrapper[4703]: E0309 13:25:07.071471 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"551a6f3e5645ea0713ab184bb1af25788508bbafec11147df236762e6bfe5fed\": container with ID starting with 551a6f3e5645ea0713ab184bb1af25788508bbafec11147df236762e6bfe5fed not found: ID does not exist" containerID="551a6f3e5645ea0713ab184bb1af25788508bbafec11147df236762e6bfe5fed"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.071493 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"551a6f3e5645ea0713ab184bb1af25788508bbafec11147df236762e6bfe5fed"} err="failed to get container status \"551a6f3e5645ea0713ab184bb1af25788508bbafec11147df236762e6bfe5fed\": rpc error: code = NotFound desc = could not find container \"551a6f3e5645ea0713ab184bb1af25788508bbafec11147df236762e6bfe5fed\": container with ID starting with 551a6f3e5645ea0713ab184bb1af25788508bbafec11147df236762e6bfe5fed not found: ID does not exist"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.138294 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09a58d8e-1a70-4e6c-b217-1c2b2537accc-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.288291 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gkg4j"]
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.293937 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gkg4j"]
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.574447 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5f4dd585db-hxvz6"]
Mar 09 13:25:07 crc kubenswrapper[4703]: E0309 13:25:07.574765 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adee29a3-c626-4dc3-8323-ce1e852faea3" containerName="extract-content"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.574782 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="adee29a3-c626-4dc3-8323-ce1e852faea3" containerName="extract-content"
Mar 09 13:25:07 crc kubenswrapper[4703]: E0309 13:25:07.574796 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58cdc96b-614c-4784-b144-dd817f60399c" containerName="route-controller-manager"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.574805 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="58cdc96b-614c-4784-b144-dd817f60399c" containerName="route-controller-manager"
Mar 09 13:25:07 crc kubenswrapper[4703]: E0309 13:25:07.574816 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3916d955-3b83-488d-959c-88b09424a3e3" containerName="registry-server"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.574824 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="3916d955-3b83-488d-959c-88b09424a3e3" containerName="registry-server"
Mar 09 13:25:07 crc kubenswrapper[4703]: E0309 13:25:07.574835 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09a58d8e-1a70-4e6c-b217-1c2b2537accc" containerName="registry-server"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.574859 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a58d8e-1a70-4e6c-b217-1c2b2537accc" containerName="registry-server"
Mar 09 13:25:07 crc kubenswrapper[4703]: E0309 13:25:07.574875 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c0fe86d-3006-44b3-811a-f28758ef07fd" containerName="extract-utilities"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.574883 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0fe86d-3006-44b3-811a-f28758ef07fd" containerName="extract-utilities"
Mar 09 13:25:07 crc kubenswrapper[4703]: E0309 13:25:07.574894 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3916d955-3b83-488d-959c-88b09424a3e3" containerName="extract-utilities"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.574902 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="3916d955-3b83-488d-959c-88b09424a3e3" containerName="extract-utilities"
Mar 09 13:25:07 crc kubenswrapper[4703]: E0309 13:25:07.574910 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c0fe86d-3006-44b3-811a-f28758ef07fd" containerName="registry-server"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.574919 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0fe86d-3006-44b3-811a-f28758ef07fd" containerName="registry-server"
Mar 09 13:25:07 crc kubenswrapper[4703]: E0309 13:25:07.574928 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3916d955-3b83-488d-959c-88b09424a3e3" containerName="extract-content"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.574939 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="3916d955-3b83-488d-959c-88b09424a3e3" containerName="extract-content"
Mar 09 13:25:07 crc kubenswrapper[4703]: E0309 13:25:07.574953 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adee29a3-c626-4dc3-8323-ce1e852faea3" containerName="extract-utilities"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.574963 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="adee29a3-c626-4dc3-8323-ce1e852faea3" containerName="extract-utilities"
Mar 09 13:25:07 crc kubenswrapper[4703]: E0309 13:25:07.574972 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09a58d8e-1a70-4e6c-b217-1c2b2537accc" containerName="extract-content"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.574980 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a58d8e-1a70-4e6c-b217-1c2b2537accc" containerName="extract-content"
Mar 09 13:25:07 crc kubenswrapper[4703]: E0309 13:25:07.574992 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09a58d8e-1a70-4e6c-b217-1c2b2537accc" containerName="extract-utilities"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.574999 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a58d8e-1a70-4e6c-b217-1c2b2537accc" containerName="extract-utilities"
Mar 09 13:25:07 crc kubenswrapper[4703]: E0309 13:25:07.575009 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c0fe86d-3006-44b3-811a-f28758ef07fd" containerName="extract-content"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.575017 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0fe86d-3006-44b3-811a-f28758ef07fd" containerName="extract-content"
Mar 09 13:25:07 crc kubenswrapper[4703]: E0309 13:25:07.575031 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adee29a3-c626-4dc3-8323-ce1e852faea3" containerName="registry-server"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.575039 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="adee29a3-c626-4dc3-8323-ce1e852faea3" containerName="registry-server"
Mar 09 13:25:07 crc kubenswrapper[4703]: E0309 13:25:07.575049 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28cd609c-fb43-4b67-9ffb-1a49482bd6ac" containerName="controller-manager"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.575059 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="28cd609c-fb43-4b67-9ffb-1a49482bd6ac" containerName="controller-manager"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.575720 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="3916d955-3b83-488d-959c-88b09424a3e3" containerName="registry-server"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.575743 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c0fe86d-3006-44b3-811a-f28758ef07fd" containerName="registry-server"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.575756 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="28cd609c-fb43-4b67-9ffb-1a49482bd6ac" containerName="controller-manager"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.575770 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="09a58d8e-1a70-4e6c-b217-1c2b2537accc" containerName="registry-server"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.575781 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="58cdc96b-614c-4784-b144-dd817f60399c" containerName="route-controller-manager"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.575789 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="adee29a3-c626-4dc3-8323-ce1e852faea3" containerName="registry-server"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.576314 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.577819 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc"]
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.578654 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.582983 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.583980 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.584218 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.584250 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.584255 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.584451 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.584521 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.584670 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.584942 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.585674 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.585876 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.586625 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.592657 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.598415 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc"]
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.605573 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f4dd585db-hxvz6"]
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.746221 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1965d96-08d3-41a8-bb30-9a0f073b486a-serving-cert\") pod \"controller-manager-5f4dd585db-hxvz6\" (UID: \"a1965d96-08d3-41a8-bb30-9a0f073b486a\") " pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.746304 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1965d96-08d3-41a8-bb30-9a0f073b486a-client-ca\") pod \"controller-manager-5f4dd585db-hxvz6\" (UID: \"a1965d96-08d3-41a8-bb30-9a0f073b486a\") " pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.746713 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d19db10b-114c-4486-b4e1-868813f01a46-serving-cert\") pod \"route-controller-manager-68c594f7b7-vcpxc\" (UID: \"d19db10b-114c-4486-b4e1-868813f01a46\") " pod="openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.746909 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d19db10b-114c-4486-b4e1-868813f01a46-config\") pod \"route-controller-manager-68c594f7b7-vcpxc\" (UID: \"d19db10b-114c-4486-b4e1-868813f01a46\") " pod="openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.747014 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1965d96-08d3-41a8-bb30-9a0f073b486a-proxy-ca-bundles\") pod \"controller-manager-5f4dd585db-hxvz6\" (UID: \"a1965d96-08d3-41a8-bb30-9a0f073b486a\") " pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.747642 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1965d96-08d3-41a8-bb30-9a0f073b486a-config\") pod \"controller-manager-5f4dd585db-hxvz6\" (UID: \"a1965d96-08d3-41a8-bb30-9a0f073b486a\") " pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.747789 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d19db10b-114c-4486-b4e1-868813f01a46-client-ca\") pod \"route-controller-manager-68c594f7b7-vcpxc\" (UID: \"d19db10b-114c-4486-b4e1-868813f01a46\") " pod="openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.747841 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dd8p\" (UniqueName: \"kubernetes.io/projected/d19db10b-114c-4486-b4e1-868813f01a46-kube-api-access-5dd8p\") pod \"route-controller-manager-68c594f7b7-vcpxc\" (UID: \"d19db10b-114c-4486-b4e1-868813f01a46\") " pod="openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.747923 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27jdz\" (UniqueName: \"kubernetes.io/projected/a1965d96-08d3-41a8-bb30-9a0f073b486a-kube-api-access-27jdz\") pod \"controller-manager-5f4dd585db-hxvz6\" (UID: \"a1965d96-08d3-41a8-bb30-9a0f073b486a\") " pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.849216 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d19db10b-114c-4486-b4e1-868813f01a46-client-ca\") pod \"route-controller-manager-68c594f7b7-vcpxc\" (UID: \"d19db10b-114c-4486-b4e1-868813f01a46\") " pod="openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.849592 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dd8p\" (UniqueName: \"kubernetes.io/projected/d19db10b-114c-4486-b4e1-868813f01a46-kube-api-access-5dd8p\") pod \"route-controller-manager-68c594f7b7-vcpxc\" (UID: \"d19db10b-114c-4486-b4e1-868813f01a46\") " pod="openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.849924 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27jdz\" (UniqueName: \"kubernetes.io/projected/a1965d96-08d3-41a8-bb30-9a0f073b486a-kube-api-access-27jdz\") pod \"controller-manager-5f4dd585db-hxvz6\" (UID: \"a1965d96-08d3-41a8-bb30-9a0f073b486a\") " pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.850226 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1965d96-08d3-41a8-bb30-9a0f073b486a-serving-cert\") pod \"controller-manager-5f4dd585db-hxvz6\" (UID: \"a1965d96-08d3-41a8-bb30-9a0f073b486a\") " pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.850487 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1965d96-08d3-41a8-bb30-9a0f073b486a-client-ca\") pod \"controller-manager-5f4dd585db-hxvz6\" (UID: \"a1965d96-08d3-41a8-bb30-9a0f073b486a\") " pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.850782 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d19db10b-114c-4486-b4e1-868813f01a46-serving-cert\") pod \"route-controller-manager-68c594f7b7-vcpxc\" (UID: \"d19db10b-114c-4486-b4e1-868813f01a46\") " pod="openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.851152 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d19db10b-114c-4486-b4e1-868813f01a46-config\") pod \"route-controller-manager-68c594f7b7-vcpxc\" (UID: \"d19db10b-114c-4486-b4e1-868813f01a46\") " pod="openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.851525 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1965d96-08d3-41a8-bb30-9a0f073b486a-proxy-ca-bundles\") pod \"controller-manager-5f4dd585db-hxvz6\" (UID: \"a1965d96-08d3-41a8-bb30-9a0f073b486a\") " pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.851813 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1965d96-08d3-41a8-bb30-9a0f073b486a-config\") pod \"controller-manager-5f4dd585db-hxvz6\" (UID: \"a1965d96-08d3-41a8-bb30-9a0f073b486a\") " pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.852699 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1965d96-08d3-41a8-bb30-9a0f073b486a-client-ca\") pod \"controller-manager-5f4dd585db-hxvz6\" (UID: \"a1965d96-08d3-41a8-bb30-9a0f073b486a\") " pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.851206 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d19db10b-114c-4486-b4e1-868813f01a46-client-ca\") pod \"route-controller-manager-68c594f7b7-vcpxc\" (UID: \"d19db10b-114c-4486-b4e1-868813f01a46\") " pod="openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.853649 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d19db10b-114c-4486-b4e1-868813f01a46-config\") pod \"route-controller-manager-68c594f7b7-vcpxc\" (UID: \"d19db10b-114c-4486-b4e1-868813f01a46\") " pod="openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.853941 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a1965d96-08d3-41a8-bb30-9a0f073b486a-proxy-ca-bundles\") pod \"controller-manager-5f4dd585db-hxvz6\" (UID: \"a1965d96-08d3-41a8-bb30-9a0f073b486a\") " pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.856282 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1965d96-08d3-41a8-bb30-9a0f073b486a-config\") pod \"controller-manager-5f4dd585db-hxvz6\" (UID: \"a1965d96-08d3-41a8-bb30-9a0f073b486a\") " pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.856990 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d19db10b-114c-4486-b4e1-868813f01a46-serving-cert\") pod \"route-controller-manager-68c594f7b7-vcpxc\" (UID: \"d19db10b-114c-4486-b4e1-868813f01a46\") " pod="openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.860687 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1965d96-08d3-41a8-bb30-9a0f073b486a-serving-cert\") pod \"controller-manager-5f4dd585db-hxvz6\" (UID: \"a1965d96-08d3-41a8-bb30-9a0f073b486a\") " pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.881101 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dd8p\" (UniqueName: \"kubernetes.io/projected/d19db10b-114c-4486-b4e1-868813f01a46-kube-api-access-5dd8p\") pod \"route-controller-manager-68c594f7b7-vcpxc\" (UID: \"d19db10b-114c-4486-b4e1-868813f01a46\") " pod="openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.881538 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27jdz\" (UniqueName: \"kubernetes.io/projected/a1965d96-08d3-41a8-bb30-9a0f073b486a-kube-api-access-27jdz\") pod \"controller-manager-5f4dd585db-hxvz6\" (UID: \"a1965d96-08d3-41a8-bb30-9a0f073b486a\") " pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.913055 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6"
Mar 09 13:25:07 crc kubenswrapper[4703]: I0309 13:25:07.920755 4703 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc" Mar 09 13:25:08 crc kubenswrapper[4703]: W0309 13:25:08.368861 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1965d96_08d3_41a8_bb30_9a0f073b486a.slice/crio-979838a0b9799a40a98512f090ef61e22c7ab323176a45143e035bfda44445d7 WatchSource:0}: Error finding container 979838a0b9799a40a98512f090ef61e22c7ab323176a45143e035bfda44445d7: Status 404 returned error can't find the container with id 979838a0b9799a40a98512f090ef61e22c7ab323176a45143e035bfda44445d7 Mar 09 13:25:08 crc kubenswrapper[4703]: I0309 13:25:08.369476 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f4dd585db-hxvz6"] Mar 09 13:25:08 crc kubenswrapper[4703]: I0309 13:25:08.384203 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc"] Mar 09 13:25:08 crc kubenswrapper[4703]: W0309 13:25:08.391232 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd19db10b_114c_4486_b4e1_868813f01a46.slice/crio-15cc52a917da5eef8ca4cd02648788014a4e19057f5b5f0be9419eb021934c38 WatchSource:0}: Error finding container 15cc52a917da5eef8ca4cd02648788014a4e19057f5b5f0be9419eb021934c38: Status 404 returned error can't find the container with id 15cc52a917da5eef8ca4cd02648788014a4e19057f5b5f0be9419eb021934c38 Mar 09 13:25:08 crc kubenswrapper[4703]: I0309 13:25:08.714225 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09a58d8e-1a70-4e6c-b217-1c2b2537accc" path="/var/lib/kubelet/pods/09a58d8e-1a70-4e6c-b217-1c2b2537accc/volumes" Mar 09 13:25:08 crc kubenswrapper[4703]: I0309 13:25:08.715441 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="28cd609c-fb43-4b67-9ffb-1a49482bd6ac" path="/var/lib/kubelet/pods/28cd609c-fb43-4b67-9ffb-1a49482bd6ac/volumes" Mar 09 13:25:08 crc kubenswrapper[4703]: I0309 13:25:08.716124 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58cdc96b-614c-4784-b144-dd817f60399c" path="/var/lib/kubelet/pods/58cdc96b-614c-4784-b144-dd817f60399c/volumes" Mar 09 13:25:08 crc kubenswrapper[4703]: I0309 13:25:08.989798 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6" event={"ID":"a1965d96-08d3-41a8-bb30-9a0f073b486a","Type":"ContainerStarted","Data":"0edeee3c45258a6917ae46ca4c78c066027c4402585f754cd52a0382db262124"} Mar 09 13:25:08 crc kubenswrapper[4703]: I0309 13:25:08.989868 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6" event={"ID":"a1965d96-08d3-41a8-bb30-9a0f073b486a","Type":"ContainerStarted","Data":"979838a0b9799a40a98512f090ef61e22c7ab323176a45143e035bfda44445d7"} Mar 09 13:25:08 crc kubenswrapper[4703]: I0309 13:25:08.990264 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6" Mar 09 13:25:08 crc kubenswrapper[4703]: I0309 13:25:08.991700 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc" event={"ID":"d19db10b-114c-4486-b4e1-868813f01a46","Type":"ContainerStarted","Data":"5843c7d41b71e36fe4aab9e9ce481a683b6f2dd72492af0f77b0a1d504045490"} Mar 09 13:25:08 crc kubenswrapper[4703]: I0309 13:25:08.991741 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc" event={"ID":"d19db10b-114c-4486-b4e1-868813f01a46","Type":"ContainerStarted","Data":"15cc52a917da5eef8ca4cd02648788014a4e19057f5b5f0be9419eb021934c38"} Mar 09 13:25:08 crc 
kubenswrapper[4703]: I0309 13:25:08.999101 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6" Mar 09 13:25:09 crc kubenswrapper[4703]: I0309 13:25:09.010098 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5f4dd585db-hxvz6" podStartSLOduration=4.0100834 podStartE2EDuration="4.0100834s" podCreationTimestamp="2026-03-09 13:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:25:09.00907495 +0000 UTC m=+304.976490636" watchObservedRunningTime="2026-03-09 13:25:09.0100834 +0000 UTC m=+304.977499086" Mar 09 13:25:09 crc kubenswrapper[4703]: I0309 13:25:09.051874 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc" podStartSLOduration=4.051855151 podStartE2EDuration="4.051855151s" podCreationTimestamp="2026-03-09 13:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:25:09.050344947 +0000 UTC m=+305.017760633" watchObservedRunningTime="2026-03-09 13:25:09.051855151 +0000 UTC m=+305.019270837" Mar 09 13:25:09 crc kubenswrapper[4703]: I0309 13:25:09.998067 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc" Mar 09 13:25:10 crc kubenswrapper[4703]: I0309 13:25:10.006319 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-68c594f7b7-vcpxc" Mar 09 13:25:11 crc kubenswrapper[4703]: E0309 13:25:11.970883 4703 file.go:109] "Unable to process watch event" err="can't process config file 
\"/etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.972791 4703 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.973505 4703 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.973679 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.974011 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734" gracePeriod=15 Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.974025 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99" gracePeriod=15 Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.974059 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1" gracePeriod=15 Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.974070 4703 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0" gracePeriod=15 Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.974054 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156" gracePeriod=15 Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.976774 4703 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 13:25:11 crc kubenswrapper[4703]: E0309 13:25:11.977222 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.977254 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 09 13:25:11 crc kubenswrapper[4703]: E0309 13:25:11.977270 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.977281 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 09 13:25:11 crc kubenswrapper[4703]: E0309 13:25:11.977307 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.977325 4703 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 09 13:25:11 crc kubenswrapper[4703]: E0309 13:25:11.977348 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.977364 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:25:11 crc kubenswrapper[4703]: E0309 13:25:11.977388 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.977402 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:25:11 crc kubenswrapper[4703]: E0309 13:25:11.977436 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.977448 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:25:11 crc kubenswrapper[4703]: E0309 13:25:11.977467 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.977480 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:25:11 crc kubenswrapper[4703]: E0309 13:25:11.977493 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:25:11 crc 
kubenswrapper[4703]: I0309 13:25:11.977505 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:25:11 crc kubenswrapper[4703]: E0309 13:25:11.977527 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.977539 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 09 13:25:11 crc kubenswrapper[4703]: E0309 13:25:11.977556 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.977569 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.977738 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.977758 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.977771 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.977785 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.977821 4703 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.977838 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.977885 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.977906 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 09 13:25:11 crc kubenswrapper[4703]: I0309 13:25:11.978265 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 13:25:12 crc kubenswrapper[4703]: E0309 13:25:12.023335 4703 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.19:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.041437 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.041493 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.041616 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.041673 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.041945 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.041995 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.042017 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.042075 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.143378 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.143668 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.143696 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.143711 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.143743 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.143761 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.143776 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.143796 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.143761 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.143866 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.143485 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.143888 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.143914 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.143897 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.143980 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.143998 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: I0309 13:25:12.323913 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:12 crc kubenswrapper[4703]: E0309 13:25:12.377033 4703 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.19:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b2f248f8f8f48 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:25:12.374300488 +0000 
UTC m=+308.341716214,LastTimestamp:2026-03-09 13:25:12.374300488 +0000 UTC m=+308.341716214,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:25:13 crc kubenswrapper[4703]: I0309 13:25:13.029717 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 13:25:13 crc kubenswrapper[4703]: I0309 13:25:13.031771 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 13:25:13 crc kubenswrapper[4703]: I0309 13:25:13.032517 4703 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99" exitCode=0 Mar 09 13:25:13 crc kubenswrapper[4703]: I0309 13:25:13.032552 4703 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1" exitCode=0 Mar 09 13:25:13 crc kubenswrapper[4703]: I0309 13:25:13.032563 4703 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156" exitCode=0 Mar 09 13:25:13 crc kubenswrapper[4703]: I0309 13:25:13.032571 4703 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0" exitCode=2 Mar 09 13:25:13 crc kubenswrapper[4703]: I0309 13:25:13.032615 4703 scope.go:117] "RemoveContainer" containerID="88c4a6a736c92c419dd352614bb391eb2b2c979e768a3af02167ac6362952565" Mar 09 13:25:13 crc kubenswrapper[4703]: I0309 13:25:13.034912 4703 generic.go:334] "Generic 
(PLEG): container finished" podID="f6bb052e-48ab-43a1-9e9a-c8109a014b1d" containerID="6a2e49c053a553a6e957409e956be34e74e84f3363b675ff9fbf1ea135d390f1" exitCode=0 Mar 09 13:25:13 crc kubenswrapper[4703]: I0309 13:25:13.034995 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f6bb052e-48ab-43a1-9e9a-c8109a014b1d","Type":"ContainerDied","Data":"6a2e49c053a553a6e957409e956be34e74e84f3363b675ff9fbf1ea135d390f1"} Mar 09 13:25:13 crc kubenswrapper[4703]: I0309 13:25:13.035681 4703 status_manager.go:851] "Failed to get status for pod" podUID="f6bb052e-48ab-43a1-9e9a-c8109a014b1d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.19:6443: connect: connection refused" Mar 09 13:25:13 crc kubenswrapper[4703]: I0309 13:25:13.038671 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"7754ffdce12b36f96382589a26c21e1487056d5b651b3cf789fd2d91f3143109"} Mar 09 13:25:13 crc kubenswrapper[4703]: I0309 13:25:13.038749 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"a272c35547267246352373081c2467bacd362c01059032a4784059b77e821220"} Mar 09 13:25:13 crc kubenswrapper[4703]: I0309 13:25:13.039816 4703 status_manager.go:851] "Failed to get status for pod" podUID="f6bb052e-48ab-43a1-9e9a-c8109a014b1d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.19:6443: connect: connection refused" Mar 09 13:25:13 crc kubenswrapper[4703]: E0309 13:25:13.039865 4703 kubelet.go:1929] "Failed 
creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.19:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.059684 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 13:25:14 crc kubenswrapper[4703]: E0309 13:25:14.076100 4703 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.19:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.335439 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.336300 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.337286 4703 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.19:6443: connect: connection refused" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.337775 4703 status_manager.go:851] "Failed to get status for pod" podUID="f6bb052e-48ab-43a1-9e9a-c8109a014b1d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.19:6443: connect: connection refused" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.383293 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.383331 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.383383 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.383404 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.383480 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.383510 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.383762 4703 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.383789 4703 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.383802 4703 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.424027 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.424523 4703 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.19:6443: connect: connection refused" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.425042 4703 status_manager.go:851] "Failed to get status for pod" podUID="f6bb052e-48ab-43a1-9e9a-c8109a014b1d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.19:6443: connect: connection refused" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.484981 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6bb052e-48ab-43a1-9e9a-c8109a014b1d-kube-api-access\") pod \"f6bb052e-48ab-43a1-9e9a-c8109a014b1d\" (UID: \"f6bb052e-48ab-43a1-9e9a-c8109a014b1d\") " Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.485059 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f6bb052e-48ab-43a1-9e9a-c8109a014b1d-var-lock\") pod \"f6bb052e-48ab-43a1-9e9a-c8109a014b1d\" (UID: \"f6bb052e-48ab-43a1-9e9a-c8109a014b1d\") " Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.485082 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6bb052e-48ab-43a1-9e9a-c8109a014b1d-kubelet-dir\") pod \"f6bb052e-48ab-43a1-9e9a-c8109a014b1d\" (UID: \"f6bb052e-48ab-43a1-9e9a-c8109a014b1d\") " Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.485173 4703 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6bb052e-48ab-43a1-9e9a-c8109a014b1d-var-lock" (OuterVolumeSpecName: "var-lock") pod "f6bb052e-48ab-43a1-9e9a-c8109a014b1d" (UID: "f6bb052e-48ab-43a1-9e9a-c8109a014b1d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.485206 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6bb052e-48ab-43a1-9e9a-c8109a014b1d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f6bb052e-48ab-43a1-9e9a-c8109a014b1d" (UID: "f6bb052e-48ab-43a1-9e9a-c8109a014b1d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.485444 4703 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f6bb052e-48ab-43a1-9e9a-c8109a014b1d-var-lock\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.485460 4703 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6bb052e-48ab-43a1-9e9a-c8109a014b1d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.491004 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6bb052e-48ab-43a1-9e9a-c8109a014b1d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f6bb052e-48ab-43a1-9e9a-c8109a014b1d" (UID: "f6bb052e-48ab-43a1-9e9a-c8109a014b1d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.586674 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6bb052e-48ab-43a1-9e9a-c8109a014b1d-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.709410 4703 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.19:6443: connect: connection refused" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.709778 4703 status_manager.go:851] "Failed to get status for pod" podUID="f6bb052e-48ab-43a1-9e9a-c8109a014b1d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.19:6443: connect: connection refused" Mar 09 13:25:14 crc kubenswrapper[4703]: I0309 13:25:14.718570 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.080428 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f6bb052e-48ab-43a1-9e9a-c8109a014b1d","Type":"ContainerDied","Data":"ceecff7ae89257a2e7ad3ed4dfd6583e466e8e6a4055a25f6c3cf93d0ce99348"} Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.080474 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ceecff7ae89257a2e7ad3ed4dfd6583e466e8e6a4055a25f6c3cf93d0ce99348" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.080446 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.083775 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.085962 4703 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734" exitCode=0 Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.086039 4703 scope.go:117] "RemoveContainer" containerID="5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.086045 4703 status_manager.go:851] "Failed to get status for pod" podUID="f6bb052e-48ab-43a1-9e9a-c8109a014b1d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.19:6443: connect: connection refused" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.086161 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.086798 4703 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.19:6443: connect: connection refused" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.087085 4703 status_manager.go:851] "Failed to get status for pod" podUID="f6bb052e-48ab-43a1-9e9a-c8109a014b1d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.19:6443: connect: connection refused" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.088869 4703 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.19:6443: connect: connection refused" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.089646 4703 status_manager.go:851] "Failed to get status for pod" podUID="f6bb052e-48ab-43a1-9e9a-c8109a014b1d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.19:6443: connect: connection refused" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.127539 4703 scope.go:117] "RemoveContainer" containerID="d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.141180 4703 scope.go:117] "RemoveContainer" containerID="78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156" Mar 09 13:25:15 crc 
kubenswrapper[4703]: I0309 13:25:15.158630 4703 scope.go:117] "RemoveContainer" containerID="32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.174572 4703 scope.go:117] "RemoveContainer" containerID="681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.192964 4703 scope.go:117] "RemoveContainer" containerID="911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.211332 4703 scope.go:117] "RemoveContainer" containerID="5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99" Mar 09 13:25:15 crc kubenswrapper[4703]: E0309 13:25:15.211752 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99\": container with ID starting with 5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99 not found: ID does not exist" containerID="5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.211803 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99"} err="failed to get container status \"5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99\": rpc error: code = NotFound desc = could not find container \"5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99\": container with ID starting with 5dcfcaf853d2313a71d0c5bcf8de7e88fb51d462ad8cefd5adb72e1c2cc1ca99 not found: ID does not exist" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.211831 4703 scope.go:117] "RemoveContainer" containerID="d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1" Mar 09 13:25:15 crc kubenswrapper[4703]: E0309 13:25:15.212313 
4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\": container with ID starting with d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1 not found: ID does not exist" containerID="d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.212356 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1"} err="failed to get container status \"d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\": rpc error: code = NotFound desc = could not find container \"d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1\": container with ID starting with d04a931cb24ae45e0308773b198648b386ad3c9d1f6da5fc7eaf2328eb00e3e1 not found: ID does not exist" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.212399 4703 scope.go:117] "RemoveContainer" containerID="78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156" Mar 09 13:25:15 crc kubenswrapper[4703]: E0309 13:25:15.212621 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\": container with ID starting with 78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156 not found: ID does not exist" containerID="78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.212649 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156"} err="failed to get container status \"78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\": rpc error: code = 
NotFound desc = could not find container \"78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156\": container with ID starting with 78a131f1491326545def3e7004c4a4fb49a118dc6ee82c61ce29877f5348e156 not found: ID does not exist" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.212664 4703 scope.go:117] "RemoveContainer" containerID="32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0" Mar 09 13:25:15 crc kubenswrapper[4703]: E0309 13:25:15.213044 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\": container with ID starting with 32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0 not found: ID does not exist" containerID="32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.213064 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0"} err="failed to get container status \"32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\": rpc error: code = NotFound desc = could not find container \"32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0\": container with ID starting with 32374e0e06704b0613bdb8f7472df94ecf93ce9ff92f4e62e3a6ed7dbaa671e0 not found: ID does not exist" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.213100 4703 scope.go:117] "RemoveContainer" containerID="681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734" Mar 09 13:25:15 crc kubenswrapper[4703]: E0309 13:25:15.213367 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\": container with ID starting with 
681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734 not found: ID does not exist" containerID="681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.213393 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734"} err="failed to get container status \"681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\": rpc error: code = NotFound desc = could not find container \"681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734\": container with ID starting with 681ccbcdf4e8b6a147152c8d5f181c328a16d12932ad8ab60efbbfeec41d2734 not found: ID does not exist" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.213427 4703 scope.go:117] "RemoveContainer" containerID="911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54" Mar 09 13:25:15 crc kubenswrapper[4703]: E0309 13:25:15.213663 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\": container with ID starting with 911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54 not found: ID does not exist" containerID="911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54" Mar 09 13:25:15 crc kubenswrapper[4703]: I0309 13:25:15.213694 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54"} err="failed to get container status \"911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\": rpc error: code = NotFound desc = could not find container \"911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54\": container with ID starting with 911c739b095a188597ed5f5eb62fcc701ac953cfa1657d4c44d21336fd0aec54 not found: ID does not 
exist" Mar 09 13:25:15 crc kubenswrapper[4703]: E0309 13:25:15.467974 4703 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.19:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b2f248f8f8f48 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 13:25:12.374300488 +0000 UTC m=+308.341716214,LastTimestamp:2026-03-09 13:25:12.374300488 +0000 UTC m=+308.341716214,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 13:25:16 crc kubenswrapper[4703]: E0309 13:25:16.710717 4703 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.129.56.19:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" volumeName="registry-storage" Mar 09 13:25:20 crc kubenswrapper[4703]: E0309 13:25:20.958279 4703 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.129.56.19:6443: connect: connection refused" Mar 09 13:25:20 crc kubenswrapper[4703]: E0309 13:25:20.960046 4703 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.19:6443: connect: connection refused" Mar 09 13:25:20 crc kubenswrapper[4703]: E0309 13:25:20.960635 4703 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.19:6443: connect: connection refused" Mar 09 13:25:20 crc kubenswrapper[4703]: E0309 13:25:20.961272 4703 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.19:6443: connect: connection refused" Mar 09 13:25:20 crc kubenswrapper[4703]: E0309 13:25:20.961586 4703 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.19:6443: connect: connection refused" Mar 09 13:25:20 crc kubenswrapper[4703]: I0309 13:25:20.961614 4703 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 09 13:25:20 crc kubenswrapper[4703]: E0309 13:25:20.961860 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.19:6443: connect: connection refused" interval="200ms" Mar 09 13:25:21 crc kubenswrapper[4703]: E0309 13:25:21.162969 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.19:6443: connect: connection refused" interval="400ms" Mar 09 13:25:21 crc kubenswrapper[4703]: E0309 13:25:21.564085 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.19:6443: connect: connection refused" interval="800ms" Mar 09 13:25:22 crc kubenswrapper[4703]: E0309 13:25:22.364824 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.19:6443: connect: connection refused" interval="1.6s" Mar 09 13:25:23 crc kubenswrapper[4703]: I0309 13:25:23.706874 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:25:23 crc kubenswrapper[4703]: I0309 13:25:23.708145 4703 status_manager.go:851] "Failed to get status for pod" podUID="f6bb052e-48ab-43a1-9e9a-c8109a014b1d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.19:6443: connect: connection refused" Mar 09 13:25:23 crc kubenswrapper[4703]: I0309 13:25:23.731556 4703 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c533cd6b-0f98-4b0f-b515-cced796ce36f" Mar 09 13:25:23 crc kubenswrapper[4703]: I0309 13:25:23.731619 4703 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c533cd6b-0f98-4b0f-b515-cced796ce36f" Mar 09 13:25:23 crc kubenswrapper[4703]: E0309 13:25:23.732357 4703 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.19:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:25:23 crc kubenswrapper[4703]: I0309 13:25:23.733230 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:25:23 crc kubenswrapper[4703]: W0309 13:25:23.773082 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-52351ec99f950af1746a78f405cca1eb3e85f46dca64f7db4785faab3e0a8bfd WatchSource:0}: Error finding container 52351ec99f950af1746a78f405cca1eb3e85f46dca64f7db4785faab3e0a8bfd: Status 404 returned error can't find the container with id 52351ec99f950af1746a78f405cca1eb3e85f46dca64f7db4785faab3e0a8bfd Mar 09 13:25:23 crc kubenswrapper[4703]: E0309 13:25:23.965929 4703 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.19:6443: connect: connection refused" interval="3.2s" Mar 09 13:25:24 crc kubenswrapper[4703]: I0309 13:25:24.182799 4703 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="1f69353dad4538c6b09ad73634e50457f388a1e062f1123b1b7da5da41ee06e5" exitCode=0 Mar 09 13:25:24 crc kubenswrapper[4703]: I0309 13:25:24.182906 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"1f69353dad4538c6b09ad73634e50457f388a1e062f1123b1b7da5da41ee06e5"} Mar 09 13:25:24 crc kubenswrapper[4703]: I0309 13:25:24.182968 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"52351ec99f950af1746a78f405cca1eb3e85f46dca64f7db4785faab3e0a8bfd"} Mar 09 13:25:24 crc kubenswrapper[4703]: I0309 13:25:24.183239 4703 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c533cd6b-0f98-4b0f-b515-cced796ce36f" Mar 09 13:25:24 crc kubenswrapper[4703]: I0309 13:25:24.183256 4703 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c533cd6b-0f98-4b0f-b515-cced796ce36f" Mar 09 13:25:24 crc kubenswrapper[4703]: E0309 13:25:24.183729 4703 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.19:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:25:24 crc kubenswrapper[4703]: I0309 13:25:24.183970 4703 status_manager.go:851] "Failed to get status for pod" podUID="f6bb052e-48ab-43a1-9e9a-c8109a014b1d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.19:6443: connect: connection refused" Mar 09 13:25:25 crc kubenswrapper[4703]: I0309 13:25:25.192284 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b37f960cb91acb42fccacbde909fae3a416d2e3384632e28bd15cac37146af2a"} Mar 09 13:25:25 crc kubenswrapper[4703]: I0309 13:25:25.192334 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"78de7ec528d08d42030c25bc3e3a2f60b254e10eb408e5bd0d58a1ea65c9d86a"} Mar 09 13:25:25 crc kubenswrapper[4703]: I0309 13:25:25.192348 4703 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"469b97c20af2e6cb40362dc2e1b28b676b94ae222986870dcd1f5d6ca1eff277"} Mar 09 13:25:26 crc kubenswrapper[4703]: I0309 13:25:26.202349 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"588ffb3845a5ac5462f55de795add128fcf615488630e4b156ce56a7add8ad98"} Mar 09 13:25:26 crc kubenswrapper[4703]: I0309 13:25:26.202670 4703 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c533cd6b-0f98-4b0f-b515-cced796ce36f" Mar 09 13:25:26 crc kubenswrapper[4703]: I0309 13:25:26.202694 4703 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c533cd6b-0f98-4b0f-b515-cced796ce36f" Mar 09 13:25:26 crc kubenswrapper[4703]: I0309 13:25:26.202700 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"431b37b5185e9a3ef50dd981b277e3eaafa5b50e74b55cfc02ee73992e7db049"} Mar 09 13:25:26 crc kubenswrapper[4703]: I0309 13:25:26.202727 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:25:26 crc kubenswrapper[4703]: I0309 13:25:26.444505 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" podUID="20cb1ef8-0711-4f38-a0aa-3a8a3953951e" containerName="oauth-openshift" containerID="cri-o://60971fa0f9dcb0c59be4eb5624df93eaf80634510d4736072cdeafe4702940f2" gracePeriod=15 Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.028234 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.039778 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-template-login\") pod \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.039937 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-audit-dir\") pod \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.040001 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czggb\" (UniqueName: \"kubernetes.io/projected/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-kube-api-access-czggb\") pod \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.040025 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "20cb1ef8-0711-4f38-a0aa-3a8a3953951e" (UID: "20cb1ef8-0711-4f38-a0aa-3a8a3953951e"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.040037 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-ocp-branding-template\") pod \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.040066 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-session\") pod \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.040105 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-template-provider-selection\") pod \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.040125 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-router-certs\") pod \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.040151 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-serving-cert\") pod \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\" (UID: 
\"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.040173 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-audit-policies\") pod \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.040365 4703 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.040800 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "20cb1ef8-0711-4f38-a0aa-3a8a3953951e" (UID: "20cb1ef8-0711-4f38-a0aa-3a8a3953951e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.045512 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-kube-api-access-czggb" (OuterVolumeSpecName: "kube-api-access-czggb") pod "20cb1ef8-0711-4f38-a0aa-3a8a3953951e" (UID: "20cb1ef8-0711-4f38-a0aa-3a8a3953951e"). InnerVolumeSpecName "kube-api-access-czggb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.045573 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "20cb1ef8-0711-4f38-a0aa-3a8a3953951e" (UID: "20cb1ef8-0711-4f38-a0aa-3a8a3953951e"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.045833 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "20cb1ef8-0711-4f38-a0aa-3a8a3953951e" (UID: "20cb1ef8-0711-4f38-a0aa-3a8a3953951e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.050635 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "20cb1ef8-0711-4f38-a0aa-3a8a3953951e" (UID: "20cb1ef8-0711-4f38-a0aa-3a8a3953951e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.051212 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "20cb1ef8-0711-4f38-a0aa-3a8a3953951e" (UID: "20cb1ef8-0711-4f38-a0aa-3a8a3953951e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.051855 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "20cb1ef8-0711-4f38-a0aa-3a8a3953951e" (UID: "20cb1ef8-0711-4f38-a0aa-3a8a3953951e"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.057029 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "20cb1ef8-0711-4f38-a0aa-3a8a3953951e" (UID: "20cb1ef8-0711-4f38-a0aa-3a8a3953951e"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.140869 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-cliconfig\") pod \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.140909 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-trusted-ca-bundle\") pod \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.140931 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-idp-0-file-data\") pod \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.140947 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-service-ca\") pod \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.140972 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-template-error\") pod \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\" (UID: \"20cb1ef8-0711-4f38-a0aa-3a8a3953951e\") " Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.141132 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.141144 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.141154 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.141162 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.141171 4703 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.141179 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.141188 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czggb\" (UniqueName: \"kubernetes.io/projected/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-kube-api-access-czggb\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.141198 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.141445 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "20cb1ef8-0711-4f38-a0aa-3a8a3953951e" (UID: "20cb1ef8-0711-4f38-a0aa-3a8a3953951e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.141656 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "20cb1ef8-0711-4f38-a0aa-3a8a3953951e" (UID: "20cb1ef8-0711-4f38-a0aa-3a8a3953951e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.142021 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "20cb1ef8-0711-4f38-a0aa-3a8a3953951e" (UID: "20cb1ef8-0711-4f38-a0aa-3a8a3953951e"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.144061 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "20cb1ef8-0711-4f38-a0aa-3a8a3953951e" (UID: "20cb1ef8-0711-4f38-a0aa-3a8a3953951e"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.144210 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "20cb1ef8-0711-4f38-a0aa-3a8a3953951e" (UID: "20cb1ef8-0711-4f38-a0aa-3a8a3953951e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.209951 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.211183 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.211222 4703 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="3e3485dc1ed7496fc861abc4578735829d38eb0dedae3210718362c42783bb16" exitCode=1 Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.211276 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"3e3485dc1ed7496fc861abc4578735829d38eb0dedae3210718362c42783bb16"} Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.212087 4703 scope.go:117] "RemoveContainer" containerID="3e3485dc1ed7496fc861abc4578735829d38eb0dedae3210718362c42783bb16" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.212591 4703 generic.go:334] "Generic (PLEG): container finished" podID="20cb1ef8-0711-4f38-a0aa-3a8a3953951e" containerID="60971fa0f9dcb0c59be4eb5624df93eaf80634510d4736072cdeafe4702940f2" exitCode=0 Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.212610 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" event={"ID":"20cb1ef8-0711-4f38-a0aa-3a8a3953951e","Type":"ContainerDied","Data":"60971fa0f9dcb0c59be4eb5624df93eaf80634510d4736072cdeafe4702940f2"} Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.212625 4703 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" event={"ID":"20cb1ef8-0711-4f38-a0aa-3a8a3953951e","Type":"ContainerDied","Data":"4cf4a9861fb51fa6d8f77320b367d5cfc2f3823ced978a458f66fbd6dadb1ea5"} Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.212639 4703 scope.go:117] "RemoveContainer" containerID="60971fa0f9dcb0c59be4eb5624df93eaf80634510d4736072cdeafe4702940f2" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.212913 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ghbk9" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.241916 4703 scope.go:117] "RemoveContainer" containerID="60971fa0f9dcb0c59be4eb5624df93eaf80634510d4736072cdeafe4702940f2" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.241969 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.241996 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.242006 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.242015 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.242025 4703 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/20cb1ef8-0711-4f38-a0aa-3a8a3953951e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:27 crc kubenswrapper[4703]: E0309 13:25:27.242440 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60971fa0f9dcb0c59be4eb5624df93eaf80634510d4736072cdeafe4702940f2\": container with ID starting with 60971fa0f9dcb0c59be4eb5624df93eaf80634510d4736072cdeafe4702940f2 not found: ID does not exist" containerID="60971fa0f9dcb0c59be4eb5624df93eaf80634510d4736072cdeafe4702940f2" Mar 09 13:25:27 crc kubenswrapper[4703]: I0309 13:25:27.242476 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60971fa0f9dcb0c59be4eb5624df93eaf80634510d4736072cdeafe4702940f2"} err="failed to get container status \"60971fa0f9dcb0c59be4eb5624df93eaf80634510d4736072cdeafe4702940f2\": rpc error: code = NotFound desc = could not find container \"60971fa0f9dcb0c59be4eb5624df93eaf80634510d4736072cdeafe4702940f2\": container with ID starting with 60971fa0f9dcb0c59be4eb5624df93eaf80634510d4736072cdeafe4702940f2 not found: ID does not exist" Mar 09 13:25:28 crc kubenswrapper[4703]: I0309 13:25:28.134040 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:25:28 crc kubenswrapper[4703]: I0309 13:25:28.225169 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 13:25:28 crc kubenswrapper[4703]: I0309 13:25:28.226633 4703 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 09 13:25:28 crc kubenswrapper[4703]: I0309 13:25:28.226759 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3920f177bcef9bd7178dda67ced4163ac89956d90a1cf32e14a143d431ae6743"} Mar 09 13:25:28 crc kubenswrapper[4703]: I0309 13:25:28.733749 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:25:28 crc kubenswrapper[4703]: I0309 13:25:28.733824 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:25:28 crc kubenswrapper[4703]: I0309 13:25:28.743024 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:25:31 crc kubenswrapper[4703]: I0309 13:25:31.211887 4703 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:25:31 crc kubenswrapper[4703]: I0309 13:25:31.246143 4703 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c533cd6b-0f98-4b0f-b515-cced796ce36f" Mar 09 13:25:31 crc kubenswrapper[4703]: I0309 13:25:31.246174 4703 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c533cd6b-0f98-4b0f-b515-cced796ce36f" Mar 09 13:25:31 crc kubenswrapper[4703]: I0309 13:25:31.252260 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 13:25:31 crc kubenswrapper[4703]: I0309 13:25:31.254521 4703 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" 
pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="047e1925-dd63-4e53-9949-cd0fc98bb64c" Mar 09 13:25:32 crc kubenswrapper[4703]: I0309 13:25:32.251681 4703 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c533cd6b-0f98-4b0f-b515-cced796ce36f" Mar 09 13:25:32 crc kubenswrapper[4703]: I0309 13:25:32.251725 4703 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c533cd6b-0f98-4b0f-b515-cced796ce36f" Mar 09 13:25:34 crc kubenswrapper[4703]: I0309 13:25:34.062469 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:25:34 crc kubenswrapper[4703]: I0309 13:25:34.069578 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:25:34 crc kubenswrapper[4703]: I0309 13:25:34.266459 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:25:34 crc kubenswrapper[4703]: I0309 13:25:34.720016 4703 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="047e1925-dd63-4e53-9949-cd0fc98bb64c" Mar 09 13:25:37 crc kubenswrapper[4703]: I0309 13:25:37.330769 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 09 13:25:37 crc kubenswrapper[4703]: I0309 13:25:37.994281 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 09 13:25:38 crc kubenswrapper[4703]: I0309 13:25:38.142304 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 13:25:39 crc kubenswrapper[4703]: I0309 13:25:39.476566 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 09 13:25:41 crc kubenswrapper[4703]: I0309 13:25:41.022180 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 09 13:25:41 crc kubenswrapper[4703]: I0309 13:25:41.244445 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 09 13:25:41 crc kubenswrapper[4703]: I0309 13:25:41.626416 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 09 13:25:42 crc kubenswrapper[4703]: I0309 13:25:42.497682 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 09 13:25:43 crc kubenswrapper[4703]: I0309 13:25:43.186572 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 09 13:25:43 crc kubenswrapper[4703]: I0309 13:25:43.188038 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 09 13:25:43 crc kubenswrapper[4703]: I0309 13:25:43.278961 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 09 13:25:43 crc kubenswrapper[4703]: I0309 13:25:43.422677 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 09 13:25:43 crc kubenswrapper[4703]: I0309 13:25:43.518619 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 09 13:25:43 crc kubenswrapper[4703]: I0309 13:25:43.804662 4703 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 09 13:25:43 crc kubenswrapper[4703]: I0309 13:25:43.845143 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 09 13:25:44 crc kubenswrapper[4703]: I0309 13:25:44.037242 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 09 13:25:44 crc kubenswrapper[4703]: I0309 13:25:44.110074 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 09 13:25:44 crc kubenswrapper[4703]: I0309 13:25:44.181293 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 09 13:25:44 crc kubenswrapper[4703]: I0309 13:25:44.200964 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 09 13:25:44 crc kubenswrapper[4703]: I0309 13:25:44.518829 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 09 13:25:44 crc kubenswrapper[4703]: I0309 13:25:44.894607 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 09 13:25:44 crc kubenswrapper[4703]: I0309 13:25:44.939801 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 09 13:25:45 crc kubenswrapper[4703]: I0309 13:25:45.074477 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 09 13:25:45 crc kubenswrapper[4703]: I0309 13:25:45.180122 4703 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 09 13:25:45 crc kubenswrapper[4703]: I0309 13:25:45.191262 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 09 13:25:45 crc kubenswrapper[4703]: I0309 13:25:45.214433 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 09 13:25:45 crc kubenswrapper[4703]: I0309 13:25:45.270615 4703 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 09 13:25:45 crc kubenswrapper[4703]: I0309 13:25:45.355295 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 09 13:25:45 crc kubenswrapper[4703]: I0309 13:25:45.365143 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 09 13:25:45 crc kubenswrapper[4703]: I0309 13:25:45.396357 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 09 13:25:45 crc kubenswrapper[4703]: I0309 13:25:45.500401 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 09 13:25:45 crc kubenswrapper[4703]: I0309 13:25:45.534104 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 09 13:25:45 crc kubenswrapper[4703]: I0309 13:25:45.584741 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 09 13:25:45 crc kubenswrapper[4703]: I0309 13:25:45.880650 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 09 13:25:45 crc kubenswrapper[4703]: I0309 13:25:45.916573 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 09 13:25:46 crc kubenswrapper[4703]: I0309 13:25:46.057905 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 09 13:25:46 crc kubenswrapper[4703]: I0309 13:25:46.089070 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 09 13:25:46 crc kubenswrapper[4703]: I0309 13:25:46.091078 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 09 13:25:46 crc kubenswrapper[4703]: I0309 13:25:46.183598 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 09 13:25:46 crc kubenswrapper[4703]: I0309 13:25:46.476291 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 09 13:25:46 crc kubenswrapper[4703]: I0309 13:25:46.497738 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 09 13:25:46 crc kubenswrapper[4703]: I0309 13:25:46.533430 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 09 13:25:46 crc kubenswrapper[4703]: I0309 13:25:46.579674 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 09 13:25:46 crc kubenswrapper[4703]: I0309 13:25:46.608647 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 09 13:25:46 crc kubenswrapper[4703]: I0309 13:25:46.646372 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 09 13:25:46 crc kubenswrapper[4703]: I0309 13:25:46.797023 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 09 13:25:46 crc kubenswrapper[4703]: I0309 13:25:46.797138 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 09 13:25:46 crc kubenswrapper[4703]: I0309 13:25:46.928598 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 09 13:25:46 crc kubenswrapper[4703]: I0309 13:25:46.969783 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 09 13:25:46 crc kubenswrapper[4703]: I0309 13:25:46.993377 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 09 13:25:47 crc kubenswrapper[4703]: I0309 13:25:47.135575 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 09 13:25:47 crc kubenswrapper[4703]: I0309 13:25:47.212383 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 09 13:25:47 crc kubenswrapper[4703]: I0309 13:25:47.219414 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 09 13:25:47 crc kubenswrapper[4703]: I0309 13:25:47.292151 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 09 13:25:47 crc kubenswrapper[4703]: I0309 13:25:47.326177 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 09 13:25:47 crc kubenswrapper[4703]: I0309 13:25:47.333055 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 09 13:25:47 crc kubenswrapper[4703]: I0309 13:25:47.375680 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 09 13:25:47 crc kubenswrapper[4703]: I0309 13:25:47.379555 4703 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 09 13:25:47 crc kubenswrapper[4703]: I0309 13:25:47.384425 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 09 13:25:47 crc kubenswrapper[4703]: I0309 13:25:47.385337 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ghbk9","openshift-kube-apiserver/kube-apiserver-crc"]
Mar 09 13:25:47 crc kubenswrapper[4703]: I0309 13:25:47.385406 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 09 13:25:47 crc kubenswrapper[4703]: I0309 13:25:47.385831 4703 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c533cd6b-0f98-4b0f-b515-cced796ce36f"
Mar 09 13:25:47 crc kubenswrapper[4703]: I0309 13:25:47.385902 4703 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c533cd6b-0f98-4b0f-b515-cced796ce36f"
Mar 09 13:25:47 crc kubenswrapper[4703]: I0309 13:25:47.391205 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 13:25:47 crc kubenswrapper[4703]: I0309 13:25:47.413710 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.413691667 podStartE2EDuration="16.413691667s" podCreationTimestamp="2026-03-09 13:25:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:25:47.411662048 +0000 UTC m=+343.379077784" watchObservedRunningTime="2026-03-09 13:25:47.413691667 +0000 UTC m=+343.381107363"
Mar 09 13:25:47 crc kubenswrapper[4703]: I0309 13:25:47.431482 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 09 13:25:47 crc kubenswrapper[4703]: I0309 13:25:47.589359 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 09 13:25:47 crc kubenswrapper[4703]: I0309 13:25:47.929649 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.018663 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.095578 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.125092 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.151239 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.220821 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.238937 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.277485 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.286320 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.342200 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.354268 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.367070 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.393048 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.517669 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.564210 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.598109 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.600441 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.660978 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.699952 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.715589 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20cb1ef8-0711-4f38-a0aa-3a8a3953951e" path="/var/lib/kubelet/pods/20cb1ef8-0711-4f38-a0aa-3a8a3953951e/volumes"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.770137 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.776297 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.855146 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.890920 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.916578 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.932601 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.956739 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.957966 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 09 13:25:48 crc kubenswrapper[4703]: I0309 13:25:48.967685 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 09 13:25:49 crc kubenswrapper[4703]: I0309 13:25:49.079260 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 09 13:25:49 crc kubenswrapper[4703]: I0309 13:25:49.084020 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 09 13:25:49 crc kubenswrapper[4703]: I0309 13:25:49.271549 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 09 13:25:49 crc kubenswrapper[4703]: I0309 13:25:49.577398 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 09 13:25:49 crc kubenswrapper[4703]: I0309 13:25:49.662198 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 09 13:25:49 crc kubenswrapper[4703]: I0309 13:25:49.681415 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 09 13:25:49 crc kubenswrapper[4703]: I0309 13:25:49.702285 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 09 13:25:49 crc kubenswrapper[4703]: I0309 13:25:49.748198 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 09 13:25:49 crc kubenswrapper[4703]: I0309 13:25:49.794629 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 09 13:25:49 crc kubenswrapper[4703]: I0309 13:25:49.838413 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 09 13:25:49 crc kubenswrapper[4703]: I0309 13:25:49.839935 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 09 13:25:49 crc kubenswrapper[4703]: I0309 13:25:49.886686 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 09 13:25:49 crc kubenswrapper[4703]: I0309 13:25:49.993804 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 09 13:25:50 crc kubenswrapper[4703]: I0309 13:25:50.012976 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 09 13:25:50 crc kubenswrapper[4703]: I0309 13:25:50.043008 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 09 13:25:50 crc kubenswrapper[4703]: I0309 13:25:50.052312 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 09 13:25:50 crc kubenswrapper[4703]: I0309 13:25:50.057125 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 09 13:25:50 crc kubenswrapper[4703]: I0309 13:25:50.096523 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 09 13:25:50 crc kubenswrapper[4703]: I0309 13:25:50.115036 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 09 13:25:50 crc kubenswrapper[4703]: I0309 13:25:50.144322 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 09 13:25:50 crc kubenswrapper[4703]: I0309 13:25:50.179675 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 09 13:25:50 crc kubenswrapper[4703]: I0309 13:25:50.196152 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 09 13:25:50 crc kubenswrapper[4703]: I0309 13:25:50.200587 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 09 13:25:50 crc kubenswrapper[4703]: I0309 13:25:50.219189 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 09 13:25:50 crc kubenswrapper[4703]: I0309 13:25:50.242002 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 09 13:25:50 crc kubenswrapper[4703]: I0309 13:25:50.253339 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 09 13:25:50 crc kubenswrapper[4703]: I0309 13:25:50.263338 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 09 13:25:50 crc kubenswrapper[4703]: I0309 13:25:50.351762 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 09 13:25:50 crc kubenswrapper[4703]: I0309 13:25:50.580610 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 09 13:25:50 crc kubenswrapper[4703]: I0309 13:25:50.850113 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 09 13:25:50 crc kubenswrapper[4703]: I0309 13:25:50.929545 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 09 13:25:50 crc kubenswrapper[4703]: I0309 13:25:50.966592 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 09 13:25:50 crc kubenswrapper[4703]: I0309 13:25:50.967616 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.026298 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.109628 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.129327 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.154045 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.196098 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.316497 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.324245 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.460574 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.489983 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.546535 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.561492 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.613539 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.687195 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.710839 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.763690 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.787327 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.813464 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.817164 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.826901 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.854908 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.951960 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 09 13:25:51 crc kubenswrapper[4703]: I0309 13:25:51.971135 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 09 13:25:52 crc kubenswrapper[4703]: I0309 13:25:52.041652 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 09 13:25:52 crc kubenswrapper[4703]: I0309 13:25:52.056337 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 09 13:25:52 crc kubenswrapper[4703]: I0309 13:25:52.098555 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 09 13:25:52 crc kubenswrapper[4703]: I0309 13:25:52.118178 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 09 13:25:52 crc kubenswrapper[4703]: I0309 13:25:52.169578 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 09 13:25:52 crc kubenswrapper[4703]: I0309 13:25:52.223146 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 09 13:25:52 crc kubenswrapper[4703]: I0309 13:25:52.272917 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 09 13:25:52 crc kubenswrapper[4703]: I0309 13:25:52.336906 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 09 13:25:52 crc kubenswrapper[4703]: I0309 13:25:52.416471 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 09 13:25:52 crc kubenswrapper[4703]: I0309 13:25:52.526733 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 09 13:25:52 crc kubenswrapper[4703]: I0309 13:25:52.686255 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 09 13:25:52 crc kubenswrapper[4703]: I0309 13:25:52.690929 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 09 13:25:52 crc kubenswrapper[4703]: I0309 13:25:52.697109 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 09 13:25:52 crc kubenswrapper[4703]: I0309 13:25:52.765193 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 09 13:25:52 crc kubenswrapper[4703]: I0309 13:25:52.773279 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 09 13:25:52 crc kubenswrapper[4703]: I0309 13:25:52.825137 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 09 13:25:52 crc kubenswrapper[4703]: I0309 13:25:52.864474 4703 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 09 13:25:52 crc kubenswrapper[4703]: I0309 13:25:52.890682 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 09 13:25:52 crc kubenswrapper[4703]: I0309 13:25:52.913330 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 09 13:25:52 crc kubenswrapper[4703]: I0309 13:25:52.938629 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 09 13:25:52 crc kubenswrapper[4703]: I0309 13:25:52.969681 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 09 13:25:53 crc kubenswrapper[4703]: I0309 13:25:53.025888 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 09 13:25:53 crc kubenswrapper[4703]: I0309 13:25:53.126096 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 09 13:25:53 crc kubenswrapper[4703]: I0309 13:25:53.129017 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 09 13:25:53 crc kubenswrapper[4703]: I0309 13:25:53.140268 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 09 13:25:53 crc kubenswrapper[4703]: I0309 13:25:53.195770 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 09 13:25:53 crc kubenswrapper[4703]: I0309 13:25:53.272410 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 09 13:25:53 crc kubenswrapper[4703]: I0309 13:25:53.308002 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 09 13:25:53 crc kubenswrapper[4703]: I0309 13:25:53.354863 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 09 13:25:53 crc kubenswrapper[4703]: I0309 13:25:53.405123 4703 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 09 13:25:53 crc kubenswrapper[4703]: I0309 13:25:53.465563 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 09 13:25:53 crc kubenswrapper[4703]: I0309 13:25:53.542797 4703 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 09 13:25:53 crc kubenswrapper[4703]: I0309 13:25:53.543158 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://7754ffdce12b36f96382589a26c21e1487056d5b651b3cf789fd2d91f3143109" gracePeriod=5
Mar 09 13:25:53 crc kubenswrapper[4703]: I0309 13:25:53.587494 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 09 13:25:53 crc kubenswrapper[4703]: I0309 13:25:53.645311 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 09 13:25:53 crc kubenswrapper[4703]: I0309 13:25:53.673682 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 09 13:25:53 crc kubenswrapper[4703]: I0309 13:25:53.683002 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 09 13:25:53 crc kubenswrapper[4703]: I0309 13:25:53.878671 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 09 13:25:54 crc kubenswrapper[4703]: I0309 13:25:54.262382 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 09 13:25:54 crc kubenswrapper[4703]: I0309 13:25:54.340170 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 09 13:25:54 crc kubenswrapper[4703]: I0309 13:25:54.473381 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 09 13:25:54 crc kubenswrapper[4703]: I0309 13:25:54.670142 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 09 13:25:54 crc kubenswrapper[4703]: I0309 13:25:54.693715 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 09 13:25:54 crc kubenswrapper[4703]: I0309 13:25:54.698508 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 09 13:25:54 crc kubenswrapper[4703]: I0309 13:25:54.774899 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 09 13:25:54 crc kubenswrapper[4703]: I0309 13:25:54.799220 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 09 13:25:54 crc kubenswrapper[4703]: I0309 13:25:54.830228 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 09 13:25:54 crc kubenswrapper[4703]: I0309 13:25:54.909082 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 09 13:25:54 crc kubenswrapper[4703]: I0309 13:25:54.913270 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 09 13:25:54 crc kubenswrapper[4703]: I0309 13:25:54.914936 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 09 13:25:55 crc kubenswrapper[4703]: I0309 13:25:55.080627 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 09 13:25:55 crc kubenswrapper[4703]: I0309 13:25:55.110213 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 09 13:25:55 crc kubenswrapper[4703]: I0309 13:25:55.119638 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 09 13:25:55 crc kubenswrapper[4703]: I0309 13:25:55.130141 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 09 13:25:55 crc kubenswrapper[4703]: I0309 13:25:55.191327 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 09 13:25:55 crc kubenswrapper[4703]: I0309 13:25:55.277344 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 09 13:25:55 crc kubenswrapper[4703]: I0309 13:25:55.464191 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 09 13:25:55 crc kubenswrapper[4703]: I0309 13:25:55.480938 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 09 13:25:55 crc kubenswrapper[4703]: I0309 13:25:55.487699 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 09 13:25:55 crc kubenswrapper[4703]: I0309 13:25:55.532409 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 09 13:25:55 crc kubenswrapper[4703]: I0309 13:25:55.688672 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 09 13:25:55 crc kubenswrapper[4703]: I0309 13:25:55.782068 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 09 13:25:55 crc kubenswrapper[4703]: I0309 13:25:55.810274 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 09 13:25:55 crc kubenswrapper[4703]: I0309 13:25:55.836102 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 09 13:25:55 crc kubenswrapper[4703]: I0309 13:25:55.909766 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 09 13:25:56 crc kubenswrapper[4703]: I0309 13:25:56.071574 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 09 13:25:56 crc kubenswrapper[4703]: I0309 13:25:56.123128 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 09 13:25:56 crc kubenswrapper[4703]: I0309 13:25:56.177079 4703 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 09 13:25:56 crc kubenswrapper[4703]: I0309 13:25:56.268145 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 09 13:25:56 crc kubenswrapper[4703]: I0309 13:25:56.310918 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 09 13:25:56 crc kubenswrapper[4703]: I0309 13:25:56.481558 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 09 13:25:56 crc kubenswrapper[4703]: I0309 13:25:56.576607 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 09 13:25:56 crc kubenswrapper[4703]: I0309 13:25:56.606257 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 09 13:25:56 crc kubenswrapper[4703]: I0309 13:25:56.738295 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 09 13:25:56 crc kubenswrapper[4703]: I0309 13:25:56.739213 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 09 13:25:56 crc kubenswrapper[4703]: I0309 13:25:56.885357 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 09 13:25:56 crc kubenswrapper[4703]: I0309 13:25:56.918170 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 09 13:25:57 crc kubenswrapper[4703]: I0309 13:25:57.035987 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 09 13:25:57 crc kubenswrapper[4703]: I0309 13:25:57.145408 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 09 13:25:57 crc kubenswrapper[4703]: I0309 13:25:57.147618 4703 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 09 13:25:57 crc kubenswrapper[4703]: I0309 13:25:57.209528 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 09 13:25:57 crc kubenswrapper[4703]: I0309 13:25:57.311555 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 09 13:25:57 crc kubenswrapper[4703]: I0309 13:25:57.322349 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 09 13:25:57 crc kubenswrapper[4703]: I0309 13:25:57.435880 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 09 13:25:57 crc kubenswrapper[4703]: I0309 13:25:57.859353 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.118414 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.321680 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.477395 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.598981 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.614685 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-679ddc4df6-smm7g"]
Mar 09 13:25:58 crc kubenswrapper[4703]: E0309 13:25:58.615050 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6bb052e-48ab-43a1-9e9a-c8109a014b1d" containerName="installer"
Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.615071 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6bb052e-48ab-43a1-9e9a-c8109a014b1d" containerName="installer"
Mar 09 13:25:58 crc kubenswrapper[4703]: E0309 13:25:58.615095 4703 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="20cb1ef8-0711-4f38-a0aa-3a8a3953951e" containerName="oauth-openshift" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.615110 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cb1ef8-0711-4f38-a0aa-3a8a3953951e" containerName="oauth-openshift" Mar 09 13:25:58 crc kubenswrapper[4703]: E0309 13:25:58.615134 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.615149 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.615363 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6bb052e-48ab-43a1-9e9a-c8109a014b1d" containerName="installer" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.615393 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.615415 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="20cb1ef8-0711-4f38-a0aa-3a8a3953951e" containerName="oauth-openshift" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.616032 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.619031 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.619135 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.619259 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.620245 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.620315 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.620654 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.620798 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.620900 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.620956 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.623224 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 09 
13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.627002 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.627102 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.631750 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.633239 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-679ddc4df6-smm7g"] Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.640465 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.647379 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.653635 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-system-router-certs\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.653688 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-system-session\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " 
pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.653717 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-user-template-error\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.653748 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.653769 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzw2s\" (UniqueName: \"kubernetes.io/projected/a539e27c-908a-4b60-afb6-25d37aad3574-kube-api-access-nzw2s\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.653795 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-system-serving-cert\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.653923 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.654004 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-system-cliconfig\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.654027 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-system-service-ca\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.654062 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a539e27c-908a-4b60-afb6-25d37aad3574-audit-dir\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.654091 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-user-template-login\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.654111 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a539e27c-908a-4b60-afb6-25d37aad3574-audit-policies\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.654183 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.654211 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.754472 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.754510 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.754537 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-system-router-certs\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.754558 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-system-session\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.754576 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-user-template-error\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.754593 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.754610 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzw2s\" (UniqueName: \"kubernetes.io/projected/a539e27c-908a-4b60-afb6-25d37aad3574-kube-api-access-nzw2s\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.754628 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-system-serving-cert\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.754648 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.754673 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.754688 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-system-service-ca\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.754710 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a539e27c-908a-4b60-afb6-25d37aad3574-audit-dir\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.754731 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-user-template-login\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.754746 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a539e27c-908a-4b60-afb6-25d37aad3574-audit-policies\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.755543 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/a539e27c-908a-4b60-afb6-25d37aad3574-audit-dir\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.756067 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a539e27c-908a-4b60-afb6-25d37aad3574-audit-policies\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.756999 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-system-cliconfig\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.757398 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.757943 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-system-service-ca\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc 
kubenswrapper[4703]: I0309 13:25:58.760091 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-user-template-error\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.760368 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-system-session\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.761107 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-system-router-certs\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.761498 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-system-serving-cert\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.762718 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.767591 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-user-template-login\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.772163 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.775445 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a539e27c-908a-4b60-afb6-25d37aad3574-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.776636 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzw2s\" (UniqueName: \"kubernetes.io/projected/a539e27c-908a-4b60-afb6-25d37aad3574-kube-api-access-nzw2s\") pod \"oauth-openshift-679ddc4df6-smm7g\" (UID: \"a539e27c-908a-4b60-afb6-25d37aad3574\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.937232 4703 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 09 13:25:58 crc kubenswrapper[4703]: I0309 13:25:58.987918 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.133699 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.143752 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.143824 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.159937 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.159992 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.160028 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 13:25:59 crc 
kubenswrapper[4703]: I0309 13:25:59.160075 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.160125 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.160402 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.160882 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.160949 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.160974 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.167197 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.227742 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.262162 4703 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.262230 4703 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.262267 4703 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.262280 4703 reconciler_common.go:293] "Volume detached for volume 
\"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.262292 4703 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.304535 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.420687 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.420772 4703 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="7754ffdce12b36f96382589a26c21e1487056d5b651b3cf789fd2d91f3143109" exitCode=137 Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.420835 4703 scope.go:117] "RemoveContainer" containerID="7754ffdce12b36f96382589a26c21e1487056d5b651b3cf789fd2d91f3143109" Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.420940 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.455494 4703 scope.go:117] "RemoveContainer" containerID="7754ffdce12b36f96382589a26c21e1487056d5b651b3cf789fd2d91f3143109" Mar 09 13:25:59 crc kubenswrapper[4703]: E0309 13:25:59.456135 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7754ffdce12b36f96382589a26c21e1487056d5b651b3cf789fd2d91f3143109\": container with ID starting with 7754ffdce12b36f96382589a26c21e1487056d5b651b3cf789fd2d91f3143109 not found: ID does not exist" containerID="7754ffdce12b36f96382589a26c21e1487056d5b651b3cf789fd2d91f3143109" Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.456189 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7754ffdce12b36f96382589a26c21e1487056d5b651b3cf789fd2d91f3143109"} err="failed to get container status \"7754ffdce12b36f96382589a26c21e1487056d5b651b3cf789fd2d91f3143109\": rpc error: code = NotFound desc = could not find container \"7754ffdce12b36f96382589a26c21e1487056d5b651b3cf789fd2d91f3143109\": container with ID starting with 7754ffdce12b36f96382589a26c21e1487056d5b651b3cf789fd2d91f3143109 not found: ID does not exist" Mar 09 13:25:59 crc kubenswrapper[4703]: I0309 13:25:59.475431 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-679ddc4df6-smm7g"] Mar 09 13:26:00 crc kubenswrapper[4703]: I0309 13:26:00.050035 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 09 13:26:00 crc kubenswrapper[4703]: I0309 13:26:00.079570 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 09 13:26:00 crc kubenswrapper[4703]: I0309 13:26:00.173599 4703 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551046-r6sbp"] Mar 09 13:26:00 crc kubenswrapper[4703]: I0309 13:26:00.174573 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551046-r6sbp" Mar 09 13:26:00 crc kubenswrapper[4703]: I0309 13:26:00.179010 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551046-r6sbp"] Mar 09 13:26:00 crc kubenswrapper[4703]: I0309 13:26:00.179832 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:26:00 crc kubenswrapper[4703]: I0309 13:26:00.180239 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rkrjn" Mar 09 13:26:00 crc kubenswrapper[4703]: I0309 13:26:00.181152 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:26:00 crc kubenswrapper[4703]: I0309 13:26:00.276116 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spzdc\" (UniqueName: \"kubernetes.io/projected/e289a811-bb2f-43bb-9543-5ae330b86a91-kube-api-access-spzdc\") pod \"auto-csr-approver-29551046-r6sbp\" (UID: \"e289a811-bb2f-43bb-9543-5ae330b86a91\") " pod="openshift-infra/auto-csr-approver-29551046-r6sbp" Mar 09 13:26:00 crc kubenswrapper[4703]: I0309 13:26:00.377603 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spzdc\" (UniqueName: \"kubernetes.io/projected/e289a811-bb2f-43bb-9543-5ae330b86a91-kube-api-access-spzdc\") pod \"auto-csr-approver-29551046-r6sbp\" (UID: \"e289a811-bb2f-43bb-9543-5ae330b86a91\") " pod="openshift-infra/auto-csr-approver-29551046-r6sbp" Mar 09 13:26:00 crc kubenswrapper[4703]: I0309 13:26:00.398380 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spzdc\" 
(UniqueName: \"kubernetes.io/projected/e289a811-bb2f-43bb-9543-5ae330b86a91-kube-api-access-spzdc\") pod \"auto-csr-approver-29551046-r6sbp\" (UID: \"e289a811-bb2f-43bb-9543-5ae330b86a91\") " pod="openshift-infra/auto-csr-approver-29551046-r6sbp" Mar 09 13:26:00 crc kubenswrapper[4703]: I0309 13:26:00.428953 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" event={"ID":"a539e27c-908a-4b60-afb6-25d37aad3574","Type":"ContainerStarted","Data":"c32b1ee3934ec5595a1e1a89c1241daffafd0d420e2df3fd9c93ee026fcf9998"} Mar 09 13:26:00 crc kubenswrapper[4703]: I0309 13:26:00.429042 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" event={"ID":"a539e27c-908a-4b60-afb6-25d37aad3574","Type":"ContainerStarted","Data":"a737d03b5f3cd3c75c26157ef719dbb4f599adb9d7a08665716b38b03b569c58"} Mar 09 13:26:00 crc kubenswrapper[4703]: I0309 13:26:00.429083 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:26:00 crc kubenswrapper[4703]: I0309 13:26:00.434929 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" Mar 09 13:26:00 crc kubenswrapper[4703]: I0309 13:26:00.454654 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-679ddc4df6-smm7g" podStartSLOduration=59.454630974 podStartE2EDuration="59.454630974s" podCreationTimestamp="2026-03-09 13:25:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:26:00.448548477 +0000 UTC m=+356.415964163" watchObservedRunningTime="2026-03-09 13:26:00.454630974 +0000 UTC m=+356.422046700" Mar 09 13:26:00 crc kubenswrapper[4703]: I0309 13:26:00.489820 4703 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551046-r6sbp" Mar 09 13:26:00 crc kubenswrapper[4703]: I0309 13:26:00.712344 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 09 13:26:00 crc kubenswrapper[4703]: I0309 13:26:00.842434 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 09 13:26:00 crc kubenswrapper[4703]: I0309 13:26:00.903667 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551046-r6sbp"] Mar 09 13:26:00 crc kubenswrapper[4703]: W0309 13:26:00.912015 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode289a811_bb2f_43bb_9543_5ae330b86a91.slice/crio-f055edc9588450b864206caa3b5933ffce611848eeb5107841b96c5ca480437b WatchSource:0}: Error finding container f055edc9588450b864206caa3b5933ffce611848eeb5107841b96c5ca480437b: Status 404 returned error can't find the container with id f055edc9588450b864206caa3b5933ffce611848eeb5107841b96c5ca480437b Mar 09 13:26:01 crc kubenswrapper[4703]: I0309 13:26:01.442729 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551046-r6sbp" event={"ID":"e289a811-bb2f-43bb-9543-5ae330b86a91","Type":"ContainerStarted","Data":"f055edc9588450b864206caa3b5933ffce611848eeb5107841b96c5ca480437b"} Mar 09 13:26:02 crc kubenswrapper[4703]: I0309 13:26:02.450275 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551046-r6sbp" event={"ID":"e289a811-bb2f-43bb-9543-5ae330b86a91","Type":"ContainerStarted","Data":"be6514ee01d072cd70cfc70248b6f6fd3d19c1b9adb5ab1428c0fae3347666f3"} Mar 09 13:26:02 crc kubenswrapper[4703]: I0309 13:26:02.468656 4703 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551046-r6sbp" podStartSLOduration=1.3266992420000001 podStartE2EDuration="2.46863728s" podCreationTimestamp="2026-03-09 13:26:00 +0000 UTC" firstStartedPulling="2026-03-09 13:26:00.923086254 +0000 UTC m=+356.890501950" lastFinishedPulling="2026-03-09 13:26:02.065024262 +0000 UTC m=+358.032439988" observedRunningTime="2026-03-09 13:26:02.466092276 +0000 UTC m=+358.433507972" watchObservedRunningTime="2026-03-09 13:26:02.46863728 +0000 UTC m=+358.436052976" Mar 09 13:26:03 crc kubenswrapper[4703]: I0309 13:26:03.463065 4703 generic.go:334] "Generic (PLEG): container finished" podID="e289a811-bb2f-43bb-9543-5ae330b86a91" containerID="be6514ee01d072cd70cfc70248b6f6fd3d19c1b9adb5ab1428c0fae3347666f3" exitCode=0 Mar 09 13:26:03 crc kubenswrapper[4703]: I0309 13:26:03.464048 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551046-r6sbp" event={"ID":"e289a811-bb2f-43bb-9543-5ae330b86a91","Type":"ContainerDied","Data":"be6514ee01d072cd70cfc70248b6f6fd3d19c1b9adb5ab1428c0fae3347666f3"} Mar 09 13:26:04 crc kubenswrapper[4703]: I0309 13:26:04.851474 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551046-r6sbp" Mar 09 13:26:04 crc kubenswrapper[4703]: I0309 13:26:04.944910 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spzdc\" (UniqueName: \"kubernetes.io/projected/e289a811-bb2f-43bb-9543-5ae330b86a91-kube-api-access-spzdc\") pod \"e289a811-bb2f-43bb-9543-5ae330b86a91\" (UID: \"e289a811-bb2f-43bb-9543-5ae330b86a91\") " Mar 09 13:26:04 crc kubenswrapper[4703]: I0309 13:26:04.950230 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e289a811-bb2f-43bb-9543-5ae330b86a91-kube-api-access-spzdc" (OuterVolumeSpecName: "kube-api-access-spzdc") pod "e289a811-bb2f-43bb-9543-5ae330b86a91" (UID: "e289a811-bb2f-43bb-9543-5ae330b86a91"). InnerVolumeSpecName "kube-api-access-spzdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:26:05 crc kubenswrapper[4703]: I0309 13:26:05.047085 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spzdc\" (UniqueName: \"kubernetes.io/projected/e289a811-bb2f-43bb-9543-5ae330b86a91-kube-api-access-spzdc\") on node \"crc\" DevicePath \"\"" Mar 09 13:26:05 crc kubenswrapper[4703]: I0309 13:26:05.476761 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551046-r6sbp" event={"ID":"e289a811-bb2f-43bb-9543-5ae330b86a91","Type":"ContainerDied","Data":"f055edc9588450b864206caa3b5933ffce611848eeb5107841b96c5ca480437b"} Mar 09 13:26:05 crc kubenswrapper[4703]: I0309 13:26:05.476802 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f055edc9588450b864206caa3b5933ffce611848eeb5107841b96c5ca480437b" Mar 09 13:26:05 crc kubenswrapper[4703]: I0309 13:26:05.476909 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551046-r6sbp" Mar 09 13:26:19 crc kubenswrapper[4703]: I0309 13:26:19.549369 4703 generic.go:334] "Generic (PLEG): container finished" podID="47af2971-9c78-4d96-a31f-8b77ea0adca2" containerID="a2b79b08c5993f1fa4d2576a57241202d122089aaed8847862fa8260cb159d13" exitCode=0 Mar 09 13:26:19 crc kubenswrapper[4703]: I0309 13:26:19.549451 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" event={"ID":"47af2971-9c78-4d96-a31f-8b77ea0adca2","Type":"ContainerDied","Data":"a2b79b08c5993f1fa4d2576a57241202d122089aaed8847862fa8260cb159d13"} Mar 09 13:26:19 crc kubenswrapper[4703]: I0309 13:26:19.550375 4703 scope.go:117] "RemoveContainer" containerID="a2b79b08c5993f1fa4d2576a57241202d122089aaed8847862fa8260cb159d13" Mar 09 13:26:20 crc kubenswrapper[4703]: I0309 13:26:20.559730 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" event={"ID":"47af2971-9c78-4d96-a31f-8b77ea0adca2","Type":"ContainerStarted","Data":"08702aaaa4cfd7bbe4b728b3bbdfe510536137c311c37c0b3f5c08daba806266"} Mar 09 13:26:20 crc kubenswrapper[4703]: I0309 13:26:20.560427 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" Mar 09 13:26:20 crc kubenswrapper[4703]: I0309 13:26:20.562353 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" Mar 09 13:27:09 crc kubenswrapper[4703]: I0309 13:27:09.499976 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:27:09 crc kubenswrapper[4703]: I0309 13:27:09.500760 
4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.567456 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mgmz6"] Mar 09 13:27:23 crc kubenswrapper[4703]: E0309 13:27:23.568284 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e289a811-bb2f-43bb-9543-5ae330b86a91" containerName="oc" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.568300 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="e289a811-bb2f-43bb-9543-5ae330b86a91" containerName="oc" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.568419 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="e289a811-bb2f-43bb-9543-5ae330b86a91" containerName="oc" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.568879 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.593140 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mgmz6"] Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.701065 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b1cf8aa-0073-43ed-83a9-cae1b047dafc-registry-certificates\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.701114 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b1cf8aa-0073-43ed-83a9-cae1b047dafc-bound-sa-token\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.701145 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b1cf8aa-0073-43ed-83a9-cae1b047dafc-registry-tls\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.701306 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b1cf8aa-0073-43ed-83a9-cae1b047dafc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.701361 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b1cf8aa-0073-43ed-83a9-cae1b047dafc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.701512 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.701626 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b1cf8aa-0073-43ed-83a9-cae1b047dafc-trusted-ca\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.701681 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwlmt\" (UniqueName: \"kubernetes.io/projected/1b1cf8aa-0073-43ed-83a9-cae1b047dafc-kube-api-access-kwlmt\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.728805 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.803100 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b1cf8aa-0073-43ed-83a9-cae1b047dafc-registry-certificates\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.803166 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b1cf8aa-0073-43ed-83a9-cae1b047dafc-bound-sa-token\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.803215 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b1cf8aa-0073-43ed-83a9-cae1b047dafc-registry-tls\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.803454 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b1cf8aa-0073-43ed-83a9-cae1b047dafc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.803500 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b1cf8aa-0073-43ed-83a9-cae1b047dafc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.803571 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b1cf8aa-0073-43ed-83a9-cae1b047dafc-trusted-ca\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.803614 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwlmt\" (UniqueName: \"kubernetes.io/projected/1b1cf8aa-0073-43ed-83a9-cae1b047dafc-kube-api-access-kwlmt\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.804256 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b1cf8aa-0073-43ed-83a9-cae1b047dafc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.804436 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b1cf8aa-0073-43ed-83a9-cae1b047dafc-registry-certificates\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.805901 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b1cf8aa-0073-43ed-83a9-cae1b047dafc-trusted-ca\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.809858 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b1cf8aa-0073-43ed-83a9-cae1b047dafc-registry-tls\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.810302 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b1cf8aa-0073-43ed-83a9-cae1b047dafc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.820272 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b1cf8aa-0073-43ed-83a9-cae1b047dafc-bound-sa-token\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: \"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.826056 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwlmt\" (UniqueName: \"kubernetes.io/projected/1b1cf8aa-0073-43ed-83a9-cae1b047dafc-kube-api-access-kwlmt\") pod \"image-registry-66df7c8f76-mgmz6\" (UID: 
\"1b1cf8aa-0073-43ed-83a9-cae1b047dafc\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:23 crc kubenswrapper[4703]: I0309 13:27:23.885749 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:24 crc kubenswrapper[4703]: W0309 13:27:24.137176 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b1cf8aa_0073_43ed_83a9_cae1b047dafc.slice/crio-9a0c169cc3ebe39582d0297ee740435793fcfe725ec23f5ffa2ff29d284b6102 WatchSource:0}: Error finding container 9a0c169cc3ebe39582d0297ee740435793fcfe725ec23f5ffa2ff29d284b6102: Status 404 returned error can't find the container with id 9a0c169cc3ebe39582d0297ee740435793fcfe725ec23f5ffa2ff29d284b6102 Mar 09 13:27:24 crc kubenswrapper[4703]: I0309 13:27:24.144208 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mgmz6"] Mar 09 13:27:24 crc kubenswrapper[4703]: I0309 13:27:24.963089 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" event={"ID":"1b1cf8aa-0073-43ed-83a9-cae1b047dafc","Type":"ContainerStarted","Data":"c62ec75f43d0f22d9ce264e0567600eaaa7f3aa6bd7f350d451cb7c2f3297815"} Mar 09 13:27:24 crc kubenswrapper[4703]: I0309 13:27:24.963756 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" event={"ID":"1b1cf8aa-0073-43ed-83a9-cae1b047dafc","Type":"ContainerStarted","Data":"9a0c169cc3ebe39582d0297ee740435793fcfe725ec23f5ffa2ff29d284b6102"} Mar 09 13:27:24 crc kubenswrapper[4703]: I0309 13:27:24.963842 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:24 crc kubenswrapper[4703]: I0309 13:27:24.988291 4703 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" podStartSLOduration=1.988274745 podStartE2EDuration="1.988274745s" podCreationTimestamp="2026-03-09 13:27:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:27:24.983601436 +0000 UTC m=+440.951017132" watchObservedRunningTime="2026-03-09 13:27:24.988274745 +0000 UTC m=+440.955690431" Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.498565 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zc87r"] Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.502246 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zc87r" podUID="0ab69d10-042e-422a-830e-65d3d1132197" containerName="registry-server" containerID="cri-o://b6bb7f2fe6b79256167e3abbb980a757cf8b978503a5884a36cfc1c109315108" gracePeriod=30 Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.506622 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kvftn"] Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.507164 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kvftn" podUID="174b5189-1afe-40a3-813b-052dd29ad296" containerName="registry-server" containerID="cri-o://4c52deea3c2497e76ca0e8aeee775c9afd2392c12f3ab422874576d073fda039" gracePeriod=30 Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.524984 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ptzrr"] Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.525301 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" 
podUID="47af2971-9c78-4d96-a31f-8b77ea0adca2" containerName="marketplace-operator" containerID="cri-o://08702aaaa4cfd7bbe4b728b3bbdfe510536137c311c37c0b3f5c08daba806266" gracePeriod=30
Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.549935 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5rmg"]
Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.550291 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j5rmg" podUID="a7eceeab-0758-4f0f-88c7-5b744fbab868" containerName="registry-server" containerID="cri-o://f85f27e053a633e84612bfec49a94a8ef0d8c405ff58452f07501008893cb1e3" gracePeriod=30
Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.564288 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q7bwg"]
Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.577601 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q7bwg"
Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.581236 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m75f5"]
Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.581480 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m75f5" podUID="75d9c9d6-9701-4627-a87b-b2ae669a0eae" containerName="registry-server" containerID="cri-o://c1649cbb037f0ab5e9da86f0e4b7bc95cc003b46aa66f5a079246e7083ca2b70" gracePeriod=30
Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.587276 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q7bwg"]
Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.634748 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73560a56-06ae-4ac9-91b9-4fe8478082e7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q7bwg\" (UID: \"73560a56-06ae-4ac9-91b9-4fe8478082e7\") " pod="openshift-marketplace/marketplace-operator-79b997595-q7bwg"
Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.635080 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwmm7\" (UniqueName: \"kubernetes.io/projected/73560a56-06ae-4ac9-91b9-4fe8478082e7-kube-api-access-pwmm7\") pod \"marketplace-operator-79b997595-q7bwg\" (UID: \"73560a56-06ae-4ac9-91b9-4fe8478082e7\") " pod="openshift-marketplace/marketplace-operator-79b997595-q7bwg"
Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.635252 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/73560a56-06ae-4ac9-91b9-4fe8478082e7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q7bwg\" (UID: \"73560a56-06ae-4ac9-91b9-4fe8478082e7\") " pod="openshift-marketplace/marketplace-operator-79b997595-q7bwg"
Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.736915 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73560a56-06ae-4ac9-91b9-4fe8478082e7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q7bwg\" (UID: \"73560a56-06ae-4ac9-91b9-4fe8478082e7\") " pod="openshift-marketplace/marketplace-operator-79b997595-q7bwg"
Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.736949 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwmm7\" (UniqueName: \"kubernetes.io/projected/73560a56-06ae-4ac9-91b9-4fe8478082e7-kube-api-access-pwmm7\") pod \"marketplace-operator-79b997595-q7bwg\" (UID: \"73560a56-06ae-4ac9-91b9-4fe8478082e7\") " pod="openshift-marketplace/marketplace-operator-79b997595-q7bwg"
Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.736970 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/73560a56-06ae-4ac9-91b9-4fe8478082e7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q7bwg\" (UID: \"73560a56-06ae-4ac9-91b9-4fe8478082e7\") " pod="openshift-marketplace/marketplace-operator-79b997595-q7bwg"
Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.740063 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73560a56-06ae-4ac9-91b9-4fe8478082e7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q7bwg\" (UID: \"73560a56-06ae-4ac9-91b9-4fe8478082e7\") " pod="openshift-marketplace/marketplace-operator-79b997595-q7bwg"
Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.743464 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/73560a56-06ae-4ac9-91b9-4fe8478082e7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q7bwg\" (UID: \"73560a56-06ae-4ac9-91b9-4fe8478082e7\") " pod="openshift-marketplace/marketplace-operator-79b997595-q7bwg"
Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.760345 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwmm7\" (UniqueName: \"kubernetes.io/projected/73560a56-06ae-4ac9-91b9-4fe8478082e7-kube-api-access-pwmm7\") pod \"marketplace-operator-79b997595-q7bwg\" (UID: \"73560a56-06ae-4ac9-91b9-4fe8478082e7\") " pod="openshift-marketplace/marketplace-operator-79b997595-q7bwg"
Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.936905 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q7bwg"
Mar 09 13:27:33 crc kubenswrapper[4703]: I0309 13:27:33.951522 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zc87r"
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.016278 4703 generic.go:334] "Generic (PLEG): container finished" podID="a7eceeab-0758-4f0f-88c7-5b744fbab868" containerID="f85f27e053a633e84612bfec49a94a8ef0d8c405ff58452f07501008893cb1e3" exitCode=0
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.016339 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5rmg" event={"ID":"a7eceeab-0758-4f0f-88c7-5b744fbab868","Type":"ContainerDied","Data":"f85f27e053a633e84612bfec49a94a8ef0d8c405ff58452f07501008893cb1e3"}
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.022606 4703 generic.go:334] "Generic (PLEG): container finished" podID="75d9c9d6-9701-4627-a87b-b2ae669a0eae" containerID="c1649cbb037f0ab5e9da86f0e4b7bc95cc003b46aa66f5a079246e7083ca2b70" exitCode=0
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.022746 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m75f5" event={"ID":"75d9c9d6-9701-4627-a87b-b2ae669a0eae","Type":"ContainerDied","Data":"c1649cbb037f0ab5e9da86f0e4b7bc95cc003b46aa66f5a079246e7083ca2b70"}
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.026489 4703 generic.go:334] "Generic (PLEG): container finished" podID="47af2971-9c78-4d96-a31f-8b77ea0adca2" containerID="08702aaaa4cfd7bbe4b728b3bbdfe510536137c311c37c0b3f5c08daba806266" exitCode=0
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.026535 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" event={"ID":"47af2971-9c78-4d96-a31f-8b77ea0adca2","Type":"ContainerDied","Data":"08702aaaa4cfd7bbe4b728b3bbdfe510536137c311c37c0b3f5c08daba806266"}
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.026569 4703 scope.go:117] "RemoveContainer" containerID="a2b79b08c5993f1fa4d2576a57241202d122089aaed8847862fa8260cb159d13"
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.036521 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr"
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.038373 4703 generic.go:334] "Generic (PLEG): container finished" podID="0ab69d10-042e-422a-830e-65d3d1132197" containerID="b6bb7f2fe6b79256167e3abbb980a757cf8b978503a5884a36cfc1c109315108" exitCode=0
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.038425 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zc87r" event={"ID":"0ab69d10-042e-422a-830e-65d3d1132197","Type":"ContainerDied","Data":"b6bb7f2fe6b79256167e3abbb980a757cf8b978503a5884a36cfc1c109315108"}
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.038446 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zc87r"
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.038453 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zc87r" event={"ID":"0ab69d10-042e-422a-830e-65d3d1132197","Type":"ContainerDied","Data":"d005c1640c941b89e9f88e52bcb4d0cd3aafdf1fe2822f6a1909746ef12472d7"}
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.039697 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fskpx\" (UniqueName: \"kubernetes.io/projected/0ab69d10-042e-422a-830e-65d3d1132197-kube-api-access-fskpx\") pod \"0ab69d10-042e-422a-830e-65d3d1132197\" (UID: \"0ab69d10-042e-422a-830e-65d3d1132197\") "
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.039772 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ab69d10-042e-422a-830e-65d3d1132197-catalog-content\") pod \"0ab69d10-042e-422a-830e-65d3d1132197\" (UID: \"0ab69d10-042e-422a-830e-65d3d1132197\") "
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.039809 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ab69d10-042e-422a-830e-65d3d1132197-utilities\") pod \"0ab69d10-042e-422a-830e-65d3d1132197\" (UID: \"0ab69d10-042e-422a-830e-65d3d1132197\") "
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.045046 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab69d10-042e-422a-830e-65d3d1132197-kube-api-access-fskpx" (OuterVolumeSpecName: "kube-api-access-fskpx") pod "0ab69d10-042e-422a-830e-65d3d1132197" (UID: "0ab69d10-042e-422a-830e-65d3d1132197"). InnerVolumeSpecName "kube-api-access-fskpx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.046807 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ab69d10-042e-422a-830e-65d3d1132197-utilities" (OuterVolumeSpecName: "utilities") pod "0ab69d10-042e-422a-830e-65d3d1132197" (UID: "0ab69d10-042e-422a-830e-65d3d1132197"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.056602 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kvftn"
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.060634 4703 generic.go:334] "Generic (PLEG): container finished" podID="174b5189-1afe-40a3-813b-052dd29ad296" containerID="4c52deea3c2497e76ca0e8aeee775c9afd2392c12f3ab422874576d073fda039" exitCode=0
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.060679 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kvftn" event={"ID":"174b5189-1afe-40a3-813b-052dd29ad296","Type":"ContainerDied","Data":"4c52deea3c2497e76ca0e8aeee775c9afd2392c12f3ab422874576d073fda039"}
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.071480 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5rmg"
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.073251 4703 scope.go:117] "RemoveContainer" containerID="b6bb7f2fe6b79256167e3abbb980a757cf8b978503a5884a36cfc1c109315108"
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.079083 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m75f5"
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.094783 4703 scope.go:117] "RemoveContainer" containerID="bb54e8aca747de72ce66c6c740a8a507de03e0d0cf52211d06d9cb80d9390738"
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.119007 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ab69d10-042e-422a-830e-65d3d1132197-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ab69d10-042e-422a-830e-65d3d1132197" (UID: "0ab69d10-042e-422a-830e-65d3d1132197"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.132521 4703 scope.go:117] "RemoveContainer" containerID="e9e223abb9217414105f77ca1579fa77fc0de1440d588f4619728b1d3b6a839e"
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.141166 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d9c9d6-9701-4627-a87b-b2ae669a0eae-catalog-content\") pod \"75d9c9d6-9701-4627-a87b-b2ae669a0eae\" (UID: \"75d9c9d6-9701-4627-a87b-b2ae669a0eae\") "
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.141225 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/47af2971-9c78-4d96-a31f-8b77ea0adca2-marketplace-operator-metrics\") pod \"47af2971-9c78-4d96-a31f-8b77ea0adca2\" (UID: \"47af2971-9c78-4d96-a31f-8b77ea0adca2\") "
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.141259 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf8nd\" (UniqueName: \"kubernetes.io/projected/174b5189-1afe-40a3-813b-052dd29ad296-kube-api-access-bf8nd\") pod \"174b5189-1afe-40a3-813b-052dd29ad296\" (UID: \"174b5189-1afe-40a3-813b-052dd29ad296\") "
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.141274 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174b5189-1afe-40a3-813b-052dd29ad296-catalog-content\") pod \"174b5189-1afe-40a3-813b-052dd29ad296\" (UID: \"174b5189-1afe-40a3-813b-052dd29ad296\") "
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.141297 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/47af2971-9c78-4d96-a31f-8b77ea0adca2-marketplace-trusted-ca\") pod \"47af2971-9c78-4d96-a31f-8b77ea0adca2\" (UID: \"47af2971-9c78-4d96-a31f-8b77ea0adca2\") "
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.141338 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnv65\" (UniqueName: \"kubernetes.io/projected/a7eceeab-0758-4f0f-88c7-5b744fbab868-kube-api-access-rnv65\") pod \"a7eceeab-0758-4f0f-88c7-5b744fbab868\" (UID: \"a7eceeab-0758-4f0f-88c7-5b744fbab868\") "
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.141377 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgqhz\" (UniqueName: \"kubernetes.io/projected/75d9c9d6-9701-4627-a87b-b2ae669a0eae-kube-api-access-sgqhz\") pod \"75d9c9d6-9701-4627-a87b-b2ae669a0eae\" (UID: \"75d9c9d6-9701-4627-a87b-b2ae669a0eae\") "
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.141408 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7eceeab-0758-4f0f-88c7-5b744fbab868-catalog-content\") pod \"a7eceeab-0758-4f0f-88c7-5b744fbab868\" (UID: \"a7eceeab-0758-4f0f-88c7-5b744fbab868\") "
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.141447 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174b5189-1afe-40a3-813b-052dd29ad296-utilities\") pod \"174b5189-1afe-40a3-813b-052dd29ad296\" (UID: \"174b5189-1afe-40a3-813b-052dd29ad296\") "
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.141509 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlmdj\" (UniqueName: \"kubernetes.io/projected/47af2971-9c78-4d96-a31f-8b77ea0adca2-kube-api-access-zlmdj\") pod \"47af2971-9c78-4d96-a31f-8b77ea0adca2\" (UID: \"47af2971-9c78-4d96-a31f-8b77ea0adca2\") "
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.141533 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7eceeab-0758-4f0f-88c7-5b744fbab868-utilities\") pod \"a7eceeab-0758-4f0f-88c7-5b744fbab868\" (UID: \"a7eceeab-0758-4f0f-88c7-5b744fbab868\") "
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.141552 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d9c9d6-9701-4627-a87b-b2ae669a0eae-utilities\") pod \"75d9c9d6-9701-4627-a87b-b2ae669a0eae\" (UID: \"75d9c9d6-9701-4627-a87b-b2ae669a0eae\") "
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.141756 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ab69d10-042e-422a-830e-65d3d1132197-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.141768 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ab69d10-042e-422a-830e-65d3d1132197-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.141777 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fskpx\" (UniqueName: \"kubernetes.io/projected/0ab69d10-042e-422a-830e-65d3d1132197-kube-api-access-fskpx\") on node \"crc\" DevicePath \"\""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.142111 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47af2971-9c78-4d96-a31f-8b77ea0adca2-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "47af2971-9c78-4d96-a31f-8b77ea0adca2" (UID: "47af2971-9c78-4d96-a31f-8b77ea0adca2"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.142249 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/174b5189-1afe-40a3-813b-052dd29ad296-utilities" (OuterVolumeSpecName: "utilities") pod "174b5189-1afe-40a3-813b-052dd29ad296" (UID: "174b5189-1afe-40a3-813b-052dd29ad296"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.142899 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7eceeab-0758-4f0f-88c7-5b744fbab868-utilities" (OuterVolumeSpecName: "utilities") pod "a7eceeab-0758-4f0f-88c7-5b744fbab868" (UID: "a7eceeab-0758-4f0f-88c7-5b744fbab868"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.145575 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75d9c9d6-9701-4627-a87b-b2ae669a0eae-utilities" (OuterVolumeSpecName: "utilities") pod "75d9c9d6-9701-4627-a87b-b2ae669a0eae" (UID: "75d9c9d6-9701-4627-a87b-b2ae669a0eae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.147033 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75d9c9d6-9701-4627-a87b-b2ae669a0eae-kube-api-access-sgqhz" (OuterVolumeSpecName: "kube-api-access-sgqhz") pod "75d9c9d6-9701-4627-a87b-b2ae669a0eae" (UID: "75d9c9d6-9701-4627-a87b-b2ae669a0eae"). InnerVolumeSpecName "kube-api-access-sgqhz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.147041 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7eceeab-0758-4f0f-88c7-5b744fbab868-kube-api-access-rnv65" (OuterVolumeSpecName: "kube-api-access-rnv65") pod "a7eceeab-0758-4f0f-88c7-5b744fbab868" (UID: "a7eceeab-0758-4f0f-88c7-5b744fbab868"). InnerVolumeSpecName "kube-api-access-rnv65". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.147084 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/174b5189-1afe-40a3-813b-052dd29ad296-kube-api-access-bf8nd" (OuterVolumeSpecName: "kube-api-access-bf8nd") pod "174b5189-1afe-40a3-813b-052dd29ad296" (UID: "174b5189-1afe-40a3-813b-052dd29ad296"). InnerVolumeSpecName "kube-api-access-bf8nd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.147557 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47af2971-9c78-4d96-a31f-8b77ea0adca2-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "47af2971-9c78-4d96-a31f-8b77ea0adca2" (UID: "47af2971-9c78-4d96-a31f-8b77ea0adca2"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.148402 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47af2971-9c78-4d96-a31f-8b77ea0adca2-kube-api-access-zlmdj" (OuterVolumeSpecName: "kube-api-access-zlmdj") pod "47af2971-9c78-4d96-a31f-8b77ea0adca2" (UID: "47af2971-9c78-4d96-a31f-8b77ea0adca2"). InnerVolumeSpecName "kube-api-access-zlmdj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.150286 4703 scope.go:117] "RemoveContainer" containerID="b6bb7f2fe6b79256167e3abbb980a757cf8b978503a5884a36cfc1c109315108"
Mar 09 13:27:34 crc kubenswrapper[4703]: E0309 13:27:34.153973 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6bb7f2fe6b79256167e3abbb980a757cf8b978503a5884a36cfc1c109315108\": container with ID starting with b6bb7f2fe6b79256167e3abbb980a757cf8b978503a5884a36cfc1c109315108 not found: ID does not exist" containerID="b6bb7f2fe6b79256167e3abbb980a757cf8b978503a5884a36cfc1c109315108"
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.154010 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6bb7f2fe6b79256167e3abbb980a757cf8b978503a5884a36cfc1c109315108"} err="failed to get container status \"b6bb7f2fe6b79256167e3abbb980a757cf8b978503a5884a36cfc1c109315108\": rpc error: code = NotFound desc = could not find container \"b6bb7f2fe6b79256167e3abbb980a757cf8b978503a5884a36cfc1c109315108\": container with ID starting with b6bb7f2fe6b79256167e3abbb980a757cf8b978503a5884a36cfc1c109315108 not found: ID does not exist"
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.154038 4703 scope.go:117] "RemoveContainer" containerID="bb54e8aca747de72ce66c6c740a8a507de03e0d0cf52211d06d9cb80d9390738"
Mar 09 13:27:34 crc kubenswrapper[4703]: E0309 13:27:34.155019 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb54e8aca747de72ce66c6c740a8a507de03e0d0cf52211d06d9cb80d9390738\": container with ID starting with bb54e8aca747de72ce66c6c740a8a507de03e0d0cf52211d06d9cb80d9390738 not found: ID does not exist" containerID="bb54e8aca747de72ce66c6c740a8a507de03e0d0cf52211d06d9cb80d9390738"
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.155056 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb54e8aca747de72ce66c6c740a8a507de03e0d0cf52211d06d9cb80d9390738"} err="failed to get container status \"bb54e8aca747de72ce66c6c740a8a507de03e0d0cf52211d06d9cb80d9390738\": rpc error: code = NotFound desc = could not find container \"bb54e8aca747de72ce66c6c740a8a507de03e0d0cf52211d06d9cb80d9390738\": container with ID starting with bb54e8aca747de72ce66c6c740a8a507de03e0d0cf52211d06d9cb80d9390738 not found: ID does not exist"
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.155085 4703 scope.go:117] "RemoveContainer" containerID="e9e223abb9217414105f77ca1579fa77fc0de1440d588f4619728b1d3b6a839e"
Mar 09 13:27:34 crc kubenswrapper[4703]: E0309 13:27:34.155615 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9e223abb9217414105f77ca1579fa77fc0de1440d588f4619728b1d3b6a839e\": container with ID starting with e9e223abb9217414105f77ca1579fa77fc0de1440d588f4619728b1d3b6a839e not found: ID does not exist" containerID="e9e223abb9217414105f77ca1579fa77fc0de1440d588f4619728b1d3b6a839e"
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.155641 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9e223abb9217414105f77ca1579fa77fc0de1440d588f4619728b1d3b6a839e"} err="failed to get container status \"e9e223abb9217414105f77ca1579fa77fc0de1440d588f4619728b1d3b6a839e\": rpc error: code = NotFound desc = could not find container \"e9e223abb9217414105f77ca1579fa77fc0de1440d588f4619728b1d3b6a839e\": container with ID starting with e9e223abb9217414105f77ca1579fa77fc0de1440d588f4619728b1d3b6a839e not found: ID does not exist"
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.155658 4703 scope.go:117] "RemoveContainer" containerID="4c52deea3c2497e76ca0e8aeee775c9afd2392c12f3ab422874576d073fda039"
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.176331 4703 scope.go:117] "RemoveContainer" containerID="f419bb8679aebc16727a480ccee52b111fcc5fd59cb0d7aeb8bbe7a1281ecd6a"
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.176623 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7eceeab-0758-4f0f-88c7-5b744fbab868-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7eceeab-0758-4f0f-88c7-5b744fbab868" (UID: "a7eceeab-0758-4f0f-88c7-5b744fbab868"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.199291 4703 scope.go:117] "RemoveContainer" containerID="0cd02cd3e237805eda7959fefad7349da3574c1f5972d6c49c39f9d096f6baab"
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.208445 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/174b5189-1afe-40a3-813b-052dd29ad296-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "174b5189-1afe-40a3-813b-052dd29ad296" (UID: "174b5189-1afe-40a3-813b-052dd29ad296"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.242926 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlmdj\" (UniqueName: \"kubernetes.io/projected/47af2971-9c78-4d96-a31f-8b77ea0adca2-kube-api-access-zlmdj\") on node \"crc\" DevicePath \"\""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.242957 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7eceeab-0758-4f0f-88c7-5b744fbab868-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.242968 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d9c9d6-9701-4627-a87b-b2ae669a0eae-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.242976 4703 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/47af2971-9c78-4d96-a31f-8b77ea0adca2-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.242988 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf8nd\" (UniqueName: \"kubernetes.io/projected/174b5189-1afe-40a3-813b-052dd29ad296-kube-api-access-bf8nd\") on node \"crc\" DevicePath \"\""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.242996 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174b5189-1afe-40a3-813b-052dd29ad296-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.243005 4703 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/47af2971-9c78-4d96-a31f-8b77ea0adca2-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.243014 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnv65\" (UniqueName: \"kubernetes.io/projected/a7eceeab-0758-4f0f-88c7-5b744fbab868-kube-api-access-rnv65\") on node \"crc\" DevicePath \"\""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.243023 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgqhz\" (UniqueName: \"kubernetes.io/projected/75d9c9d6-9701-4627-a87b-b2ae669a0eae-kube-api-access-sgqhz\") on node \"crc\" DevicePath \"\""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.243030 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7eceeab-0758-4f0f-88c7-5b744fbab868-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.243038 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174b5189-1afe-40a3-813b-052dd29ad296-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.285803 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75d9c9d6-9701-4627-a87b-b2ae669a0eae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75d9c9d6-9701-4627-a87b-b2ae669a0eae" (UID: "75d9c9d6-9701-4627-a87b-b2ae669a0eae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.344161 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d9c9d6-9701-4627-a87b-b2ae669a0eae-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.370414 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zc87r"]
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.375633 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zc87r"]
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.454085 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q7bwg"]
Mar 09 13:27:34 crc kubenswrapper[4703]: I0309 13:27:34.715014 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ab69d10-042e-422a-830e-65d3d1132197" path="/var/lib/kubelet/pods/0ab69d10-042e-422a-830e-65d3d1132197/volumes"
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.068018 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m75f5" event={"ID":"75d9c9d6-9701-4627-a87b-b2ae669a0eae","Type":"ContainerDied","Data":"696c916c5c9c469aab1f5dbcfef840ba09b69ba8b4479fad8fd37157031f6d6e"}
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.068078 4703 scope.go:117] "RemoveContainer" containerID="c1649cbb037f0ab5e9da86f0e4b7bc95cc003b46aa66f5a079246e7083ca2b70"
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.068121 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m75f5"
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.069796 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kvftn" event={"ID":"174b5189-1afe-40a3-813b-052dd29ad296","Type":"ContainerDied","Data":"19c102f26fa4152fccd473aef64ecc2ff89db438221496422ffbff4fa3cc56c6"}
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.069867 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kvftn"
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.072765 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5rmg"
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.073154 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5rmg" event={"ID":"a7eceeab-0758-4f0f-88c7-5b744fbab868","Type":"ContainerDied","Data":"83cfa815c4fc21adceb3e28d54ae3bbb23fa192e3af53bb135996f8ccd41e098"}
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.076523 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q7bwg" event={"ID":"73560a56-06ae-4ac9-91b9-4fe8478082e7","Type":"ContainerStarted","Data":"94a45bc5b6c4aa65f386860a6d2d1bdbfd93240599db3488497f79694cfb81a2"}
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.076573 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q7bwg" event={"ID":"73560a56-06ae-4ac9-91b9-4fe8478082e7","Type":"ContainerStarted","Data":"52ce82efc07be2bd1babf1ac9a3cd31006485b0a79720b805629a79b928f5060"}
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.076679 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-q7bwg"
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.080739 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr" event={"ID":"47af2971-9c78-4d96-a31f-8b77ea0adca2","Type":"ContainerDied","Data":"2192b87eda623302c4115b7d9c8957ddd613a611eeb4a06899874132bb6c2b49"}
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.080901 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ptzrr"
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.084741 4703 scope.go:117] "RemoveContainer" containerID="955e9a798cc83fcd9da3cd05e3b96611174c2cec57237cac82eda3dcaead74a6"
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.085816 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-q7bwg"
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.107055 4703 scope.go:117] "RemoveContainer" containerID="6a42eab526fe11f909b607ac0b4b97ff612c17e241d4f4a7b00bed5eff2595ad"
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.109032 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-q7bwg" podStartSLOduration=2.109004654 podStartE2EDuration="2.109004654s" podCreationTimestamp="2026-03-09 13:27:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:27:35.102053247 +0000 UTC m=+451.069469013" watchObservedRunningTime="2026-03-09 13:27:35.109004654 +0000 UTC m=+451.076420370"
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.131790 4703 scope.go:117] "RemoveContainer" containerID="f85f27e053a633e84612bfec49a94a8ef0d8c405ff58452f07501008893cb1e3"
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.148372 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5rmg"]
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.152967 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5rmg"]
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.161980 4703 scope.go:117] "RemoveContainer" containerID="ff5f099ff9aed37285569f642ce22a2bc54361f18f227bbc4682c7fd61f767ca"
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.166624 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m75f5"]
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.177486 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m75f5"]
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.185195 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kvftn"]
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.185956 4703 scope.go:117] "RemoveContainer" containerID="0d9dd789088be34cb1755b88473928c3cb0283e63e8e106061d6a83e515f5bed"
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.188731 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kvftn"]
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.191592 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ptzrr"]
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.195931 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ptzrr"]
Mar 09 13:27:35 crc kubenswrapper[4703]: I0309 13:27:35.198557 4703 scope.go:117] "RemoveContainer" containerID="08702aaaa4cfd7bbe4b728b3bbdfe510536137c311c37c0b3f5c08daba806266"
Mar 09 13:27:36 crc kubenswrapper[4703]: I0309 13:27:36.714502 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="174b5189-1afe-40a3-813b-052dd29ad296"
path="/var/lib/kubelet/pods/174b5189-1afe-40a3-813b-052dd29ad296/volumes" Mar 09 13:27:36 crc kubenswrapper[4703]: I0309 13:27:36.715591 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47af2971-9c78-4d96-a31f-8b77ea0adca2" path="/var/lib/kubelet/pods/47af2971-9c78-4d96-a31f-8b77ea0adca2/volumes" Mar 09 13:27:36 crc kubenswrapper[4703]: I0309 13:27:36.716176 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75d9c9d6-9701-4627-a87b-b2ae669a0eae" path="/var/lib/kubelet/pods/75d9c9d6-9701-4627-a87b-b2ae669a0eae/volumes" Mar 09 13:27:36 crc kubenswrapper[4703]: I0309 13:27:36.717356 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7eceeab-0758-4f0f-88c7-5b744fbab868" path="/var/lib/kubelet/pods/a7eceeab-0758-4f0f-88c7-5b744fbab868/volumes" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.133220 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-78djc"] Mar 09 13:27:37 crc kubenswrapper[4703]: E0309 13:27:37.133424 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d9c9d6-9701-4627-a87b-b2ae669a0eae" containerName="registry-server" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.133435 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d9c9d6-9701-4627-a87b-b2ae669a0eae" containerName="registry-server" Mar 09 13:27:37 crc kubenswrapper[4703]: E0309 13:27:37.133444 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab69d10-042e-422a-830e-65d3d1132197" containerName="registry-server" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.133449 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab69d10-042e-422a-830e-65d3d1132197" containerName="registry-server" Mar 09 13:27:37 crc kubenswrapper[4703]: E0309 13:27:37.133459 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d9c9d6-9701-4627-a87b-b2ae669a0eae" containerName="extract-content" Mar 09 
13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.133465 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d9c9d6-9701-4627-a87b-b2ae669a0eae" containerName="extract-content" Mar 09 13:27:37 crc kubenswrapper[4703]: E0309 13:27:37.133473 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174b5189-1afe-40a3-813b-052dd29ad296" containerName="extract-content" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.133478 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="174b5189-1afe-40a3-813b-052dd29ad296" containerName="extract-content" Mar 09 13:27:37 crc kubenswrapper[4703]: E0309 13:27:37.133485 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab69d10-042e-422a-830e-65d3d1132197" containerName="extract-utilities" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.133490 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab69d10-042e-422a-830e-65d3d1132197" containerName="extract-utilities" Mar 09 13:27:37 crc kubenswrapper[4703]: E0309 13:27:37.133500 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d9c9d6-9701-4627-a87b-b2ae669a0eae" containerName="extract-utilities" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.133505 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d9c9d6-9701-4627-a87b-b2ae669a0eae" containerName="extract-utilities" Mar 09 13:27:37 crc kubenswrapper[4703]: E0309 13:27:37.133514 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47af2971-9c78-4d96-a31f-8b77ea0adca2" containerName="marketplace-operator" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.133519 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="47af2971-9c78-4d96-a31f-8b77ea0adca2" containerName="marketplace-operator" Mar 09 13:27:37 crc kubenswrapper[4703]: E0309 13:27:37.133527 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7eceeab-0758-4f0f-88c7-5b744fbab868" containerName="extract-utilities" 
Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.133533 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7eceeab-0758-4f0f-88c7-5b744fbab868" containerName="extract-utilities" Mar 09 13:27:37 crc kubenswrapper[4703]: E0309 13:27:37.133540 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47af2971-9c78-4d96-a31f-8b77ea0adca2" containerName="marketplace-operator" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.133546 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="47af2971-9c78-4d96-a31f-8b77ea0adca2" containerName="marketplace-operator" Mar 09 13:27:37 crc kubenswrapper[4703]: E0309 13:27:37.133554 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab69d10-042e-422a-830e-65d3d1132197" containerName="extract-content" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.133584 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab69d10-042e-422a-830e-65d3d1132197" containerName="extract-content" Mar 09 13:27:37 crc kubenswrapper[4703]: E0309 13:27:37.133595 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174b5189-1afe-40a3-813b-052dd29ad296" containerName="extract-utilities" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.133601 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="174b5189-1afe-40a3-813b-052dd29ad296" containerName="extract-utilities" Mar 09 13:27:37 crc kubenswrapper[4703]: E0309 13:27:37.133608 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174b5189-1afe-40a3-813b-052dd29ad296" containerName="registry-server" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.133614 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="174b5189-1afe-40a3-813b-052dd29ad296" containerName="registry-server" Mar 09 13:27:37 crc kubenswrapper[4703]: E0309 13:27:37.133622 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7eceeab-0758-4f0f-88c7-5b744fbab868" 
containerName="extract-content" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.133627 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7eceeab-0758-4f0f-88c7-5b744fbab868" containerName="extract-content" Mar 09 13:27:37 crc kubenswrapper[4703]: E0309 13:27:37.133635 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7eceeab-0758-4f0f-88c7-5b744fbab868" containerName="registry-server" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.133641 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7eceeab-0758-4f0f-88c7-5b744fbab868" containerName="registry-server" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.133717 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="75d9c9d6-9701-4627-a87b-b2ae669a0eae" containerName="registry-server" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.133726 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="47af2971-9c78-4d96-a31f-8b77ea0adca2" containerName="marketplace-operator" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.133733 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="174b5189-1afe-40a3-813b-052dd29ad296" containerName="registry-server" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.133745 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ab69d10-042e-422a-830e-65d3d1132197" containerName="registry-server" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.133753 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7eceeab-0758-4f0f-88c7-5b744fbab868" containerName="registry-server" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.133924 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="47af2971-9c78-4d96-a31f-8b77ea0adca2" containerName="marketplace-operator" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.134479 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-78djc" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.137828 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.140081 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-78djc"] Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.186879 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6cc879a-e440-4851-b7ef-d3894d17bd60-utilities\") pod \"certified-operators-78djc\" (UID: \"a6cc879a-e440-4851-b7ef-d3894d17bd60\") " pod="openshift-marketplace/certified-operators-78djc" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.187026 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvzg4\" (UniqueName: \"kubernetes.io/projected/a6cc879a-e440-4851-b7ef-d3894d17bd60-kube-api-access-wvzg4\") pod \"certified-operators-78djc\" (UID: \"a6cc879a-e440-4851-b7ef-d3894d17bd60\") " pod="openshift-marketplace/certified-operators-78djc" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.187059 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6cc879a-e440-4851-b7ef-d3894d17bd60-catalog-content\") pod \"certified-operators-78djc\" (UID: \"a6cc879a-e440-4851-b7ef-d3894d17bd60\") " pod="openshift-marketplace/certified-operators-78djc" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.288459 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvzg4\" (UniqueName: \"kubernetes.io/projected/a6cc879a-e440-4851-b7ef-d3894d17bd60-kube-api-access-wvzg4\") pod \"certified-operators-78djc\" 
(UID: \"a6cc879a-e440-4851-b7ef-d3894d17bd60\") " pod="openshift-marketplace/certified-operators-78djc" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.288531 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6cc879a-e440-4851-b7ef-d3894d17bd60-catalog-content\") pod \"certified-operators-78djc\" (UID: \"a6cc879a-e440-4851-b7ef-d3894d17bd60\") " pod="openshift-marketplace/certified-operators-78djc" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.288574 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6cc879a-e440-4851-b7ef-d3894d17bd60-utilities\") pod \"certified-operators-78djc\" (UID: \"a6cc879a-e440-4851-b7ef-d3894d17bd60\") " pod="openshift-marketplace/certified-operators-78djc" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.289090 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6cc879a-e440-4851-b7ef-d3894d17bd60-utilities\") pod \"certified-operators-78djc\" (UID: \"a6cc879a-e440-4851-b7ef-d3894d17bd60\") " pod="openshift-marketplace/certified-operators-78djc" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.289113 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6cc879a-e440-4851-b7ef-d3894d17bd60-catalog-content\") pod \"certified-operators-78djc\" (UID: \"a6cc879a-e440-4851-b7ef-d3894d17bd60\") " pod="openshift-marketplace/certified-operators-78djc" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.308538 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvzg4\" (UniqueName: \"kubernetes.io/projected/a6cc879a-e440-4851-b7ef-d3894d17bd60-kube-api-access-wvzg4\") pod \"certified-operators-78djc\" (UID: \"a6cc879a-e440-4851-b7ef-d3894d17bd60\") " 
pod="openshift-marketplace/certified-operators-78djc" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.330868 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8rqdr"] Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.334254 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8rqdr" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.337050 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.342516 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8rqdr"] Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.389613 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlxr9\" (UniqueName: \"kubernetes.io/projected/fbd51f58-4fd4-4d47-a229-9e94e5328d93-kube-api-access-mlxr9\") pod \"community-operators-8rqdr\" (UID: \"fbd51f58-4fd4-4d47-a229-9e94e5328d93\") " pod="openshift-marketplace/community-operators-8rqdr" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.389660 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbd51f58-4fd4-4d47-a229-9e94e5328d93-utilities\") pod \"community-operators-8rqdr\" (UID: \"fbd51f58-4fd4-4d47-a229-9e94e5328d93\") " pod="openshift-marketplace/community-operators-8rqdr" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.389699 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbd51f58-4fd4-4d47-a229-9e94e5328d93-catalog-content\") pod \"community-operators-8rqdr\" (UID: \"fbd51f58-4fd4-4d47-a229-9e94e5328d93\") " 
pod="openshift-marketplace/community-operators-8rqdr" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.455296 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-78djc" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.490895 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbd51f58-4fd4-4d47-a229-9e94e5328d93-catalog-content\") pod \"community-operators-8rqdr\" (UID: \"fbd51f58-4fd4-4d47-a229-9e94e5328d93\") " pod="openshift-marketplace/community-operators-8rqdr" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.491049 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbd51f58-4fd4-4d47-a229-9e94e5328d93-utilities\") pod \"community-operators-8rqdr\" (UID: \"fbd51f58-4fd4-4d47-a229-9e94e5328d93\") " pod="openshift-marketplace/community-operators-8rqdr" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.491100 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlxr9\" (UniqueName: \"kubernetes.io/projected/fbd51f58-4fd4-4d47-a229-9e94e5328d93-kube-api-access-mlxr9\") pod \"community-operators-8rqdr\" (UID: \"fbd51f58-4fd4-4d47-a229-9e94e5328d93\") " pod="openshift-marketplace/community-operators-8rqdr" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.491518 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbd51f58-4fd4-4d47-a229-9e94e5328d93-catalog-content\") pod \"community-operators-8rqdr\" (UID: \"fbd51f58-4fd4-4d47-a229-9e94e5328d93\") " pod="openshift-marketplace/community-operators-8rqdr" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.492032 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/fbd51f58-4fd4-4d47-a229-9e94e5328d93-utilities\") pod \"community-operators-8rqdr\" (UID: \"fbd51f58-4fd4-4d47-a229-9e94e5328d93\") " pod="openshift-marketplace/community-operators-8rqdr" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.512222 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlxr9\" (UniqueName: \"kubernetes.io/projected/fbd51f58-4fd4-4d47-a229-9e94e5328d93-kube-api-access-mlxr9\") pod \"community-operators-8rqdr\" (UID: \"fbd51f58-4fd4-4d47-a229-9e94e5328d93\") " pod="openshift-marketplace/community-operators-8rqdr" Mar 09 13:27:37 crc kubenswrapper[4703]: I0309 13:27:37.658979 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8rqdr" Mar 09 13:27:38 crc kubenswrapper[4703]: I0309 13:27:38.587956 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8rqdr"] Mar 09 13:27:38 crc kubenswrapper[4703]: W0309 13:27:38.600685 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbd51f58_4fd4_4d47_a229_9e94e5328d93.slice/crio-c90ca299e31641c20899eac514522aba48ebed2d72992c1dfc2ea93b071f8cb4 WatchSource:0}: Error finding container c90ca299e31641c20899eac514522aba48ebed2d72992c1dfc2ea93b071f8cb4: Status 404 returned error can't find the container with id c90ca299e31641c20899eac514522aba48ebed2d72992c1dfc2ea93b071f8cb4 Mar 09 13:27:38 crc kubenswrapper[4703]: I0309 13:27:38.633103 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-78djc"] Mar 09 13:27:38 crc kubenswrapper[4703]: W0309 13:27:38.643810 4703 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6cc879a_e440_4851_b7ef_d3894d17bd60.slice/crio-62c8e22e41184eb16ecb6fb44d109ce279ff51d17f4702960e4ed434871bc480 WatchSource:0}: Error finding container 62c8e22e41184eb16ecb6fb44d109ce279ff51d17f4702960e4ed434871bc480: Status 404 returned error can't find the container with id 62c8e22e41184eb16ecb6fb44d109ce279ff51d17f4702960e4ed434871bc480 Mar 09 13:27:38 crc kubenswrapper[4703]: E0309 13:27:38.966199 4703 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6cc879a_e440_4851_b7ef_d3894d17bd60.slice/crio-5cea89e4737054254c893add51c4dca6c8807794cb22e92f4d2a63466a9c9c93.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6cc879a_e440_4851_b7ef_d3894d17bd60.slice/crio-conmon-5cea89e4737054254c893add51c4dca6c8807794cb22e92f4d2a63466a9c9c93.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.233909 4703 generic.go:334] "Generic (PLEG): container finished" podID="fbd51f58-4fd4-4d47-a229-9e94e5328d93" containerID="ce042c573b1fc3279179e181dbfdb2279e24df12dc85addee0cd9198387fce0c" exitCode=0 Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.234026 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rqdr" event={"ID":"fbd51f58-4fd4-4d47-a229-9e94e5328d93","Type":"ContainerDied","Data":"ce042c573b1fc3279179e181dbfdb2279e24df12dc85addee0cd9198387fce0c"} Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.234077 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rqdr" event={"ID":"fbd51f58-4fd4-4d47-a229-9e94e5328d93","Type":"ContainerStarted","Data":"c90ca299e31641c20899eac514522aba48ebed2d72992c1dfc2ea93b071f8cb4"} Mar 09 13:27:39 crc 
kubenswrapper[4703]: I0309 13:27:39.237321 4703 generic.go:334] "Generic (PLEG): container finished" podID="a6cc879a-e440-4851-b7ef-d3894d17bd60" containerID="5cea89e4737054254c893add51c4dca6c8807794cb22e92f4d2a63466a9c9c93" exitCode=0 Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.237380 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-78djc" event={"ID":"a6cc879a-e440-4851-b7ef-d3894d17bd60","Type":"ContainerDied","Data":"5cea89e4737054254c893add51c4dca6c8807794cb22e92f4d2a63466a9c9c93"} Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.237422 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-78djc" event={"ID":"a6cc879a-e440-4851-b7ef-d3894d17bd60","Type":"ContainerStarted","Data":"62c8e22e41184eb16ecb6fb44d109ce279ff51d17f4702960e4ed434871bc480"} Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.500456 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.500526 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.541489 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fg2b7"] Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.542384 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fg2b7" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.546710 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.558406 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fg2b7"] Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.618605 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k49tl\" (UniqueName: \"kubernetes.io/projected/8137ddd4-7854-4bd6-b93c-957922a36200-kube-api-access-k49tl\") pod \"redhat-marketplace-fg2b7\" (UID: \"8137ddd4-7854-4bd6-b93c-957922a36200\") " pod="openshift-marketplace/redhat-marketplace-fg2b7" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.618671 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8137ddd4-7854-4bd6-b93c-957922a36200-utilities\") pod \"redhat-marketplace-fg2b7\" (UID: \"8137ddd4-7854-4bd6-b93c-957922a36200\") " pod="openshift-marketplace/redhat-marketplace-fg2b7" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.618741 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8137ddd4-7854-4bd6-b93c-957922a36200-catalog-content\") pod \"redhat-marketplace-fg2b7\" (UID: \"8137ddd4-7854-4bd6-b93c-957922a36200\") " pod="openshift-marketplace/redhat-marketplace-fg2b7" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.719931 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8137ddd4-7854-4bd6-b93c-957922a36200-utilities\") pod \"redhat-marketplace-fg2b7\" (UID: 
\"8137ddd4-7854-4bd6-b93c-957922a36200\") " pod="openshift-marketplace/redhat-marketplace-fg2b7" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.720362 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8137ddd4-7854-4bd6-b93c-957922a36200-utilities\") pod \"redhat-marketplace-fg2b7\" (UID: \"8137ddd4-7854-4bd6-b93c-957922a36200\") " pod="openshift-marketplace/redhat-marketplace-fg2b7" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.720608 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8137ddd4-7854-4bd6-b93c-957922a36200-catalog-content\") pod \"redhat-marketplace-fg2b7\" (UID: \"8137ddd4-7854-4bd6-b93c-957922a36200\") " pod="openshift-marketplace/redhat-marketplace-fg2b7" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.720665 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8137ddd4-7854-4bd6-b93c-957922a36200-catalog-content\") pod \"redhat-marketplace-fg2b7\" (UID: \"8137ddd4-7854-4bd6-b93c-957922a36200\") " pod="openshift-marketplace/redhat-marketplace-fg2b7" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.721231 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k49tl\" (UniqueName: \"kubernetes.io/projected/8137ddd4-7854-4bd6-b93c-957922a36200-kube-api-access-k49tl\") pod \"redhat-marketplace-fg2b7\" (UID: \"8137ddd4-7854-4bd6-b93c-957922a36200\") " pod="openshift-marketplace/redhat-marketplace-fg2b7" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.733471 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-685h8"] Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.734616 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-685h8" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.737750 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.754768 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k49tl\" (UniqueName: \"kubernetes.io/projected/8137ddd4-7854-4bd6-b93c-957922a36200-kube-api-access-k49tl\") pod \"redhat-marketplace-fg2b7\" (UID: \"8137ddd4-7854-4bd6-b93c-957922a36200\") " pod="openshift-marketplace/redhat-marketplace-fg2b7" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.755557 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-685h8"] Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.822617 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9923728b-2d17-45a1-80a3-279456217b0f-catalog-content\") pod \"redhat-operators-685h8\" (UID: \"9923728b-2d17-45a1-80a3-279456217b0f\") " pod="openshift-marketplace/redhat-operators-685h8" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.822681 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhfbq\" (UniqueName: \"kubernetes.io/projected/9923728b-2d17-45a1-80a3-279456217b0f-kube-api-access-mhfbq\") pod \"redhat-operators-685h8\" (UID: \"9923728b-2d17-45a1-80a3-279456217b0f\") " pod="openshift-marketplace/redhat-operators-685h8" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.822860 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9923728b-2d17-45a1-80a3-279456217b0f-utilities\") pod \"redhat-operators-685h8\" (UID: 
\"9923728b-2d17-45a1-80a3-279456217b0f\") " pod="openshift-marketplace/redhat-operators-685h8" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.880701 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fg2b7" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.924239 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9923728b-2d17-45a1-80a3-279456217b0f-catalog-content\") pod \"redhat-operators-685h8\" (UID: \"9923728b-2d17-45a1-80a3-279456217b0f\") " pod="openshift-marketplace/redhat-operators-685h8" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.924295 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhfbq\" (UniqueName: \"kubernetes.io/projected/9923728b-2d17-45a1-80a3-279456217b0f-kube-api-access-mhfbq\") pod \"redhat-operators-685h8\" (UID: \"9923728b-2d17-45a1-80a3-279456217b0f\") " pod="openshift-marketplace/redhat-operators-685h8" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.924374 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9923728b-2d17-45a1-80a3-279456217b0f-utilities\") pod \"redhat-operators-685h8\" (UID: \"9923728b-2d17-45a1-80a3-279456217b0f\") " pod="openshift-marketplace/redhat-operators-685h8" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.924930 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9923728b-2d17-45a1-80a3-279456217b0f-utilities\") pod \"redhat-operators-685h8\" (UID: \"9923728b-2d17-45a1-80a3-279456217b0f\") " pod="openshift-marketplace/redhat-operators-685h8" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.927139 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9923728b-2d17-45a1-80a3-279456217b0f-catalog-content\") pod \"redhat-operators-685h8\" (UID: \"9923728b-2d17-45a1-80a3-279456217b0f\") " pod="openshift-marketplace/redhat-operators-685h8" Mar 09 13:27:39 crc kubenswrapper[4703]: I0309 13:27:39.946516 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhfbq\" (UniqueName: \"kubernetes.io/projected/9923728b-2d17-45a1-80a3-279456217b0f-kube-api-access-mhfbq\") pod \"redhat-operators-685h8\" (UID: \"9923728b-2d17-45a1-80a3-279456217b0f\") " pod="openshift-marketplace/redhat-operators-685h8" Mar 09 13:27:40 crc kubenswrapper[4703]: I0309 13:27:40.096246 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-685h8" Mar 09 13:27:40 crc kubenswrapper[4703]: I0309 13:27:40.106245 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fg2b7"] Mar 09 13:27:40 crc kubenswrapper[4703]: W0309 13:27:40.162958 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8137ddd4_7854_4bd6_b93c_957922a36200.slice/crio-aaeed92dac7c311fcd9100ca148342e5e6c4b7d3f058e07151b378517c90714e WatchSource:0}: Error finding container aaeed92dac7c311fcd9100ca148342e5e6c4b7d3f058e07151b378517c90714e: Status 404 returned error can't find the container with id aaeed92dac7c311fcd9100ca148342e5e6c4b7d3f058e07151b378517c90714e Mar 09 13:27:40 crc kubenswrapper[4703]: I0309 13:27:40.272234 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rqdr" event={"ID":"fbd51f58-4fd4-4d47-a229-9e94e5328d93","Type":"ContainerStarted","Data":"58bed576761b66eee06b801786c59a7808c199a21497dcbe25fbb14f60529c81"} Mar 09 13:27:40 crc kubenswrapper[4703]: I0309 13:27:40.275587 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-fg2b7" event={"ID":"8137ddd4-7854-4bd6-b93c-957922a36200","Type":"ContainerStarted","Data":"aaeed92dac7c311fcd9100ca148342e5e6c4b7d3f058e07151b378517c90714e"} Mar 09 13:27:40 crc kubenswrapper[4703]: I0309 13:27:40.277910 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-78djc" event={"ID":"a6cc879a-e440-4851-b7ef-d3894d17bd60","Type":"ContainerStarted","Data":"0f5934d7302ae61d1b73a82405e01de75dbae52a97f0e557d9e7a7004b27c74a"} Mar 09 13:27:40 crc kubenswrapper[4703]: I0309 13:27:40.398128 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-685h8"] Mar 09 13:27:40 crc kubenswrapper[4703]: W0309 13:27:40.529881 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9923728b_2d17_45a1_80a3_279456217b0f.slice/crio-35bb629afce7e5b41a3c311aad756f1b24dd39830d1dad480ea611be85e1b1be WatchSource:0}: Error finding container 35bb629afce7e5b41a3c311aad756f1b24dd39830d1dad480ea611be85e1b1be: Status 404 returned error can't find the container with id 35bb629afce7e5b41a3c311aad756f1b24dd39830d1dad480ea611be85e1b1be Mar 09 13:27:41 crc kubenswrapper[4703]: I0309 13:27:41.293489 4703 generic.go:334] "Generic (PLEG): container finished" podID="fbd51f58-4fd4-4d47-a229-9e94e5328d93" containerID="58bed576761b66eee06b801786c59a7808c199a21497dcbe25fbb14f60529c81" exitCode=0 Mar 09 13:27:41 crc kubenswrapper[4703]: I0309 13:27:41.293575 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rqdr" event={"ID":"fbd51f58-4fd4-4d47-a229-9e94e5328d93","Type":"ContainerDied","Data":"58bed576761b66eee06b801786c59a7808c199a21497dcbe25fbb14f60529c81"} Mar 09 13:27:41 crc kubenswrapper[4703]: I0309 13:27:41.298251 4703 generic.go:334] "Generic (PLEG): container finished" podID="8137ddd4-7854-4bd6-b93c-957922a36200" 
containerID="e7a21a0b63930c415aefe61b2fa617271a363c11d1adc24ae0fd9b9c108e7d0a" exitCode=0 Mar 09 13:27:41 crc kubenswrapper[4703]: I0309 13:27:41.298363 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fg2b7" event={"ID":"8137ddd4-7854-4bd6-b93c-957922a36200","Type":"ContainerDied","Data":"e7a21a0b63930c415aefe61b2fa617271a363c11d1adc24ae0fd9b9c108e7d0a"} Mar 09 13:27:41 crc kubenswrapper[4703]: I0309 13:27:41.308911 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-685h8" event={"ID":"9923728b-2d17-45a1-80a3-279456217b0f","Type":"ContainerDied","Data":"fec7bf52eba6ec1c7e4067834d661e80ab5380d03cf776af8bb8b168acea06ea"} Mar 09 13:27:41 crc kubenswrapper[4703]: I0309 13:27:41.308992 4703 generic.go:334] "Generic (PLEG): container finished" podID="9923728b-2d17-45a1-80a3-279456217b0f" containerID="fec7bf52eba6ec1c7e4067834d661e80ab5380d03cf776af8bb8b168acea06ea" exitCode=0 Mar 09 13:27:41 crc kubenswrapper[4703]: I0309 13:27:41.309130 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-685h8" event={"ID":"9923728b-2d17-45a1-80a3-279456217b0f","Type":"ContainerStarted","Data":"35bb629afce7e5b41a3c311aad756f1b24dd39830d1dad480ea611be85e1b1be"} Mar 09 13:27:41 crc kubenswrapper[4703]: I0309 13:27:41.317634 4703 generic.go:334] "Generic (PLEG): container finished" podID="a6cc879a-e440-4851-b7ef-d3894d17bd60" containerID="0f5934d7302ae61d1b73a82405e01de75dbae52a97f0e557d9e7a7004b27c74a" exitCode=0 Mar 09 13:27:41 crc kubenswrapper[4703]: I0309 13:27:41.317661 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-78djc" event={"ID":"a6cc879a-e440-4851-b7ef-d3894d17bd60","Type":"ContainerDied","Data":"0f5934d7302ae61d1b73a82405e01de75dbae52a97f0e557d9e7a7004b27c74a"} Mar 09 13:27:42 crc kubenswrapper[4703]: I0309 13:27:42.324289 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-685h8" event={"ID":"9923728b-2d17-45a1-80a3-279456217b0f","Type":"ContainerStarted","Data":"409a867945e168a53f9b120d2997a24212c845f591d5638469b3843c80ed80a3"} Mar 09 13:27:42 crc kubenswrapper[4703]: I0309 13:27:42.327681 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-78djc" event={"ID":"a6cc879a-e440-4851-b7ef-d3894d17bd60","Type":"ContainerStarted","Data":"efafc0cffa61bcaed01a80af7203a03543e5894da8f85c07361521dad36b4bc3"} Mar 09 13:27:42 crc kubenswrapper[4703]: I0309 13:27:42.329963 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rqdr" event={"ID":"fbd51f58-4fd4-4d47-a229-9e94e5328d93","Type":"ContainerStarted","Data":"5f7cb8337859c41470f22e667b6825ee3bfbee9c68cbdbfac466a567ede6d997"} Mar 09 13:27:42 crc kubenswrapper[4703]: I0309 13:27:42.372789 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8rqdr" podStartSLOduration=2.854047431 podStartE2EDuration="5.372769415s" podCreationTimestamp="2026-03-09 13:27:37 +0000 UTC" firstStartedPulling="2026-03-09 13:27:39.236320679 +0000 UTC m=+455.203736395" lastFinishedPulling="2026-03-09 13:27:41.755042673 +0000 UTC m=+457.722458379" observedRunningTime="2026-03-09 13:27:42.371192818 +0000 UTC m=+458.338608504" watchObservedRunningTime="2026-03-09 13:27:42.372769415 +0000 UTC m=+458.340185111" Mar 09 13:27:42 crc kubenswrapper[4703]: I0309 13:27:42.396539 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-78djc" podStartSLOduration=2.866585802 podStartE2EDuration="5.39652219s" podCreationTimestamp="2026-03-09 13:27:37 +0000 UTC" firstStartedPulling="2026-03-09 13:27:39.240000658 +0000 UTC m=+455.207416384" lastFinishedPulling="2026-03-09 13:27:41.769937086 +0000 UTC m=+457.737352772" observedRunningTime="2026-03-09 13:27:42.392196702 
+0000 UTC m=+458.359612428" watchObservedRunningTime="2026-03-09 13:27:42.39652219 +0000 UTC m=+458.363937876" Mar 09 13:27:43 crc kubenswrapper[4703]: I0309 13:27:43.337694 4703 generic.go:334] "Generic (PLEG): container finished" podID="8137ddd4-7854-4bd6-b93c-957922a36200" containerID="228033242b99e9c4334d0c0803af392eb3e35ae232c42cd1914a017107f0d622" exitCode=0 Mar 09 13:27:43 crc kubenswrapper[4703]: I0309 13:27:43.337813 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fg2b7" event={"ID":"8137ddd4-7854-4bd6-b93c-957922a36200","Type":"ContainerDied","Data":"228033242b99e9c4334d0c0803af392eb3e35ae232c42cd1914a017107f0d622"} Mar 09 13:27:43 crc kubenswrapper[4703]: I0309 13:27:43.345655 4703 generic.go:334] "Generic (PLEG): container finished" podID="9923728b-2d17-45a1-80a3-279456217b0f" containerID="409a867945e168a53f9b120d2997a24212c845f591d5638469b3843c80ed80a3" exitCode=0 Mar 09 13:27:43 crc kubenswrapper[4703]: I0309 13:27:43.345698 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-685h8" event={"ID":"9923728b-2d17-45a1-80a3-279456217b0f","Type":"ContainerDied","Data":"409a867945e168a53f9b120d2997a24212c845f591d5638469b3843c80ed80a3"} Mar 09 13:27:43 crc kubenswrapper[4703]: I0309 13:27:43.891752 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-mgmz6" Mar 09 13:27:43 crc kubenswrapper[4703]: I0309 13:27:43.947821 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lddp9"] Mar 09 13:27:44 crc kubenswrapper[4703]: I0309 13:27:44.353581 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fg2b7" event={"ID":"8137ddd4-7854-4bd6-b93c-957922a36200","Type":"ContainerStarted","Data":"fe437183588b8f6f6894cf28912522fc383f7d4df15d530f5c23de371e586f11"} Mar 09 13:27:44 crc 
kubenswrapper[4703]: I0309 13:27:44.356286 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-685h8" event={"ID":"9923728b-2d17-45a1-80a3-279456217b0f","Type":"ContainerStarted","Data":"0438c8ec540028108cdcf774f3e16b5b5ba636b08406a4ecbc68120a634c778a"} Mar 09 13:27:44 crc kubenswrapper[4703]: I0309 13:27:44.371456 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fg2b7" podStartSLOduration=2.959968727 podStartE2EDuration="5.371437657s" podCreationTimestamp="2026-03-09 13:27:39 +0000 UTC" firstStartedPulling="2026-03-09 13:27:41.301199288 +0000 UTC m=+457.268614974" lastFinishedPulling="2026-03-09 13:27:43.712668218 +0000 UTC m=+459.680083904" observedRunningTime="2026-03-09 13:27:44.369076897 +0000 UTC m=+460.336492613" watchObservedRunningTime="2026-03-09 13:27:44.371437657 +0000 UTC m=+460.338853343" Mar 09 13:27:44 crc kubenswrapper[4703]: I0309 13:27:44.385339 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-685h8" podStartSLOduration=2.974078877 podStartE2EDuration="5.38531898s" podCreationTimestamp="2026-03-09 13:27:39 +0000 UTC" firstStartedPulling="2026-03-09 13:27:41.310225296 +0000 UTC m=+457.277641012" lastFinishedPulling="2026-03-09 13:27:43.721465429 +0000 UTC m=+459.688881115" observedRunningTime="2026-03-09 13:27:44.384529766 +0000 UTC m=+460.351945462" watchObservedRunningTime="2026-03-09 13:27:44.38531898 +0000 UTC m=+460.352734666" Mar 09 13:27:47 crc kubenswrapper[4703]: I0309 13:27:47.457135 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-78djc" Mar 09 13:27:47 crc kubenswrapper[4703]: I0309 13:27:47.457588 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-78djc" Mar 09 13:27:47 crc kubenswrapper[4703]: I0309 13:27:47.508653 
4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-78djc" Mar 09 13:27:47 crc kubenswrapper[4703]: I0309 13:27:47.659606 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8rqdr" Mar 09 13:27:47 crc kubenswrapper[4703]: I0309 13:27:47.659666 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8rqdr" Mar 09 13:27:47 crc kubenswrapper[4703]: I0309 13:27:47.693867 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8rqdr" Mar 09 13:27:48 crc kubenswrapper[4703]: I0309 13:27:48.422345 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8rqdr" Mar 09 13:27:48 crc kubenswrapper[4703]: I0309 13:27:48.429291 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-78djc" Mar 09 13:27:50 crc kubenswrapper[4703]: I0309 13:27:49.880890 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fg2b7" Mar 09 13:27:50 crc kubenswrapper[4703]: I0309 13:27:49.880951 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fg2b7" Mar 09 13:27:50 crc kubenswrapper[4703]: I0309 13:27:49.922888 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fg2b7" Mar 09 13:27:50 crc kubenswrapper[4703]: I0309 13:27:50.097750 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-685h8" Mar 09 13:27:50 crc kubenswrapper[4703]: I0309 13:27:50.097902 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-685h8" Mar 09 13:27:50 crc kubenswrapper[4703]: I0309 13:27:50.423774 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fg2b7" Mar 09 13:27:51 crc kubenswrapper[4703]: I0309 13:27:51.138406 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-685h8" podUID="9923728b-2d17-45a1-80a3-279456217b0f" containerName="registry-server" probeResult="failure" output=< Mar 09 13:27:51 crc kubenswrapper[4703]: timeout: failed to connect service ":50051" within 1s Mar 09 13:27:51 crc kubenswrapper[4703]: > Mar 09 13:28:00 crc kubenswrapper[4703]: I0309 13:28:00.136879 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551048-2kpzv"] Mar 09 13:28:00 crc kubenswrapper[4703]: I0309 13:28:00.138299 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551048-2kpzv" Mar 09 13:28:00 crc kubenswrapper[4703]: I0309 13:28:00.140436 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:28:00 crc kubenswrapper[4703]: I0309 13:28:00.140642 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rkrjn" Mar 09 13:28:00 crc kubenswrapper[4703]: I0309 13:28:00.140642 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:28:00 crc kubenswrapper[4703]: I0309 13:28:00.145321 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551048-2kpzv"] Mar 09 13:28:00 crc kubenswrapper[4703]: I0309 13:28:00.159265 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-685h8" Mar 09 13:28:00 crc kubenswrapper[4703]: I0309 13:28:00.213598 4703 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-685h8" Mar 09 13:28:00 crc kubenswrapper[4703]: I0309 13:28:00.279761 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwtfs\" (UniqueName: \"kubernetes.io/projected/7a5b1542-68c7-406f-8350-a4c25bf83ea1-kube-api-access-cwtfs\") pod \"auto-csr-approver-29551048-2kpzv\" (UID: \"7a5b1542-68c7-406f-8350-a4c25bf83ea1\") " pod="openshift-infra/auto-csr-approver-29551048-2kpzv" Mar 09 13:28:00 crc kubenswrapper[4703]: I0309 13:28:00.381568 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwtfs\" (UniqueName: \"kubernetes.io/projected/7a5b1542-68c7-406f-8350-a4c25bf83ea1-kube-api-access-cwtfs\") pod \"auto-csr-approver-29551048-2kpzv\" (UID: \"7a5b1542-68c7-406f-8350-a4c25bf83ea1\") " pod="openshift-infra/auto-csr-approver-29551048-2kpzv" Mar 09 13:28:00 crc kubenswrapper[4703]: I0309 13:28:00.406770 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwtfs\" (UniqueName: \"kubernetes.io/projected/7a5b1542-68c7-406f-8350-a4c25bf83ea1-kube-api-access-cwtfs\") pod \"auto-csr-approver-29551048-2kpzv\" (UID: \"7a5b1542-68c7-406f-8350-a4c25bf83ea1\") " pod="openshift-infra/auto-csr-approver-29551048-2kpzv" Mar 09 13:28:00 crc kubenswrapper[4703]: I0309 13:28:00.454012 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551048-2kpzv" Mar 09 13:28:00 crc kubenswrapper[4703]: I0309 13:28:00.855823 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551048-2kpzv"] Mar 09 13:28:00 crc kubenswrapper[4703]: W0309 13:28:00.869360 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a5b1542_68c7_406f_8350_a4c25bf83ea1.slice/crio-924a1a314bbcd0c9039493e4e995a2c515db33944e8c72cff0eaba69f7791fed WatchSource:0}: Error finding container 924a1a314bbcd0c9039493e4e995a2c515db33944e8c72cff0eaba69f7791fed: Status 404 returned error can't find the container with id 924a1a314bbcd0c9039493e4e995a2c515db33944e8c72cff0eaba69f7791fed Mar 09 13:28:01 crc kubenswrapper[4703]: I0309 13:28:01.455004 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551048-2kpzv" event={"ID":"7a5b1542-68c7-406f-8350-a4c25bf83ea1","Type":"ContainerStarted","Data":"924a1a314bbcd0c9039493e4e995a2c515db33944e8c72cff0eaba69f7791fed"} Mar 09 13:28:02 crc kubenswrapper[4703]: I0309 13:28:02.463390 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551048-2kpzv" event={"ID":"7a5b1542-68c7-406f-8350-a4c25bf83ea1","Type":"ContainerStarted","Data":"c2d7e62e3cd6c1d43bd865acf82212b18828f4ae68f49464aa1a41c4ae9950e7"} Mar 09 13:28:02 crc kubenswrapper[4703]: I0309 13:28:02.482362 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551048-2kpzv" podStartSLOduration=1.300283228 podStartE2EDuration="2.482346136s" podCreationTimestamp="2026-03-09 13:28:00 +0000 UTC" firstStartedPulling="2026-03-09 13:28:00.871095395 +0000 UTC m=+476.838511081" lastFinishedPulling="2026-03-09 13:28:02.053158303 +0000 UTC m=+478.020573989" observedRunningTime="2026-03-09 13:28:02.480073748 +0000 UTC m=+478.447489434" 
watchObservedRunningTime="2026-03-09 13:28:02.482346136 +0000 UTC m=+478.449761832" Mar 09 13:28:03 crc kubenswrapper[4703]: I0309 13:28:03.469408 4703 generic.go:334] "Generic (PLEG): container finished" podID="7a5b1542-68c7-406f-8350-a4c25bf83ea1" containerID="c2d7e62e3cd6c1d43bd865acf82212b18828f4ae68f49464aa1a41c4ae9950e7" exitCode=0 Mar 09 13:28:03 crc kubenswrapper[4703]: I0309 13:28:03.469455 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551048-2kpzv" event={"ID":"7a5b1542-68c7-406f-8350-a4c25bf83ea1","Type":"ContainerDied","Data":"c2d7e62e3cd6c1d43bd865acf82212b18828f4ae68f49464aa1a41c4ae9950e7"} Mar 09 13:28:04 crc kubenswrapper[4703]: I0309 13:28:04.717744 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551048-2kpzv" Mar 09 13:28:04 crc kubenswrapper[4703]: I0309 13:28:04.845250 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwtfs\" (UniqueName: \"kubernetes.io/projected/7a5b1542-68c7-406f-8350-a4c25bf83ea1-kube-api-access-cwtfs\") pod \"7a5b1542-68c7-406f-8350-a4c25bf83ea1\" (UID: \"7a5b1542-68c7-406f-8350-a4c25bf83ea1\") " Mar 09 13:28:04 crc kubenswrapper[4703]: I0309 13:28:04.857856 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a5b1542-68c7-406f-8350-a4c25bf83ea1-kube-api-access-cwtfs" (OuterVolumeSpecName: "kube-api-access-cwtfs") pod "7a5b1542-68c7-406f-8350-a4c25bf83ea1" (UID: "7a5b1542-68c7-406f-8350-a4c25bf83ea1"). InnerVolumeSpecName "kube-api-access-cwtfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:28:04 crc kubenswrapper[4703]: I0309 13:28:04.946415 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwtfs\" (UniqueName: \"kubernetes.io/projected/7a5b1542-68c7-406f-8350-a4c25bf83ea1-kube-api-access-cwtfs\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:05 crc kubenswrapper[4703]: I0309 13:28:05.481398 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551048-2kpzv" event={"ID":"7a5b1542-68c7-406f-8350-a4c25bf83ea1","Type":"ContainerDied","Data":"924a1a314bbcd0c9039493e4e995a2c515db33944e8c72cff0eaba69f7791fed"} Mar 09 13:28:05 crc kubenswrapper[4703]: I0309 13:28:05.482004 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="924a1a314bbcd0c9039493e4e995a2c515db33944e8c72cff0eaba69f7791fed" Mar 09 13:28:05 crc kubenswrapper[4703]: I0309 13:28:05.482085 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551048-2kpzv" Mar 09 13:28:05 crc kubenswrapper[4703]: I0309 13:28:05.536533 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551042-fv4ms"] Mar 09 13:28:05 crc kubenswrapper[4703]: I0309 13:28:05.539317 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551042-fv4ms"] Mar 09 13:28:06 crc kubenswrapper[4703]: I0309 13:28:06.716013 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64728a68-4675-4652-a800-7f055197862b" path="/var/lib/kubelet/pods/64728a68-4675-4652-a800-7f055197862b/volumes" Mar 09 13:28:08 crc kubenswrapper[4703]: I0309 13:28:08.992533 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" podUID="cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd" containerName="registry" 
containerID="cri-o://c7cb9f2516753f4a5344222bcf5cbde69d6dd97646a2910bee5eae8819f92728" gracePeriod=30 Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.398825 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.500190 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.500246 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.500292 4703 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.501016 4703 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"811feda9599d695e42cb52b203e2807fe7b8b341b8ecccd4904c7902531d2ff1"} pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.501080 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" 
containerID="cri-o://811feda9599d695e42cb52b203e2807fe7b8b341b8ecccd4904c7902531d2ff1" gracePeriod=600 Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.505764 4703 generic.go:334] "Generic (PLEG): container finished" podID="cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd" containerID="c7cb9f2516753f4a5344222bcf5cbde69d6dd97646a2910bee5eae8819f92728" exitCode=0 Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.505826 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" event={"ID":"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd","Type":"ContainerDied","Data":"c7cb9f2516753f4a5344222bcf5cbde69d6dd97646a2910bee5eae8819f92728"} Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.505868 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" event={"ID":"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd","Type":"ContainerDied","Data":"b5a0a84afda82daca18826d95e91c3a56b8df030921d7cdfb63836f6adabc1a5"} Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.505889 4703 scope.go:117] "RemoveContainer" containerID="c7cb9f2516753f4a5344222bcf5cbde69d6dd97646a2910bee5eae8819f92728" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.505897 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lddp9" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.512665 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.512744 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-registry-certificates\") pod \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.512767 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-trusted-ca\") pod \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.512825 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcjjp\" (UniqueName: \"kubernetes.io/projected/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-kube-api-access-kcjjp\") pod \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.512898 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-ca-trust-extracted\") pod \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.512944 4703 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-registry-tls\") pod \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.512986 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-installation-pull-secrets\") pod \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.513013 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-bound-sa-token\") pod \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\" (UID: \"cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd\") " Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.513733 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.514544 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.519189 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-kube-api-access-kcjjp" (OuterVolumeSpecName: "kube-api-access-kcjjp") pod "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd"). InnerVolumeSpecName "kube-api-access-kcjjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.519354 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.521905 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.522787 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.529065 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.537149 4703 scope.go:117] "RemoveContainer" containerID="c7cb9f2516753f4a5344222bcf5cbde69d6dd97646a2910bee5eae8819f92728" Mar 09 13:28:09 crc kubenswrapper[4703]: E0309 13:28:09.537552 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7cb9f2516753f4a5344222bcf5cbde69d6dd97646a2910bee5eae8819f92728\": container with ID starting with c7cb9f2516753f4a5344222bcf5cbde69d6dd97646a2910bee5eae8819f92728 not found: ID does not exist" containerID="c7cb9f2516753f4a5344222bcf5cbde69d6dd97646a2910bee5eae8819f92728" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.537593 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7cb9f2516753f4a5344222bcf5cbde69d6dd97646a2910bee5eae8819f92728"} err="failed to get container status \"c7cb9f2516753f4a5344222bcf5cbde69d6dd97646a2910bee5eae8819f92728\": rpc error: code = NotFound desc = could not find container \"c7cb9f2516753f4a5344222bcf5cbde69d6dd97646a2910bee5eae8819f92728\": container with ID starting with c7cb9f2516753f4a5344222bcf5cbde69d6dd97646a2910bee5eae8819f92728 not found: ID does not exist" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.542314 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-ca-trust-extracted" (OuterVolumeSpecName: 
"ca-trust-extracted") pod "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd" (UID: "cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.614126 4703 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.614158 4703 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.614197 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcjjp\" (UniqueName: \"kubernetes.io/projected/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-kube-api-access-kcjjp\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.614211 4703 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.614224 4703 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.614234 4703 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.614243 4703 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.836169 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lddp9"] Mar 09 13:28:09 crc kubenswrapper[4703]: I0309 13:28:09.840878 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lddp9"] Mar 09 13:28:10 crc kubenswrapper[4703]: I0309 13:28:10.512937 4703 generic.go:334] "Generic (PLEG): container finished" podID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerID="811feda9599d695e42cb52b203e2807fe7b8b341b8ecccd4904c7902531d2ff1" exitCode=0 Mar 09 13:28:10 crc kubenswrapper[4703]: I0309 13:28:10.513264 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" event={"ID":"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29","Type":"ContainerDied","Data":"811feda9599d695e42cb52b203e2807fe7b8b341b8ecccd4904c7902531d2ff1"} Mar 09 13:28:10 crc kubenswrapper[4703]: I0309 13:28:10.513295 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" event={"ID":"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29","Type":"ContainerStarted","Data":"6173caf8bb9b858c40be1c304f6c4689d83a93c7a582d8cb68f949346af67630"} Mar 09 13:28:10 crc kubenswrapper[4703]: I0309 13:28:10.513317 4703 scope.go:117] "RemoveContainer" containerID="4c9fc3d67a15d4a66d56ea7b57899de24cd14117fceeac979967d0eb1b49823f" Mar 09 13:28:10 crc kubenswrapper[4703]: I0309 13:28:10.714616 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd" path="/var/lib/kubelet/pods/cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd/volumes" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.161354 4703 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29551050-w874x"] Mar 09 13:30:00 crc kubenswrapper[4703]: E0309 13:30:00.162624 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a5b1542-68c7-406f-8350-a4c25bf83ea1" containerName="oc" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.162656 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a5b1542-68c7-406f-8350-a4c25bf83ea1" containerName="oc" Mar 09 13:30:00 crc kubenswrapper[4703]: E0309 13:30:00.162705 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd" containerName="registry" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.162722 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd" containerName="registry" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.162970 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbbe85f0-658d-442a-bf5e-5ab93c6ddbdd" containerName="registry" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.163001 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a5b1542-68c7-406f-8350-a4c25bf83ea1" containerName="oc" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.163903 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551050-w874x" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.166644 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.167137 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.167256 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rkrjn" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.169497 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551050-w4987"] Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.171389 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-w4987" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.173919 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.174012 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551050-w874x"] Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.174132 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.186516 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551050-w4987"] Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.279413 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/58c8f82e-86b7-41aa-aef3-9b8bd04d1b33-secret-volume\") pod \"collect-profiles-29551050-w4987\" (UID: \"58c8f82e-86b7-41aa-aef3-9b8bd04d1b33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-w4987" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.279470 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58c8f82e-86b7-41aa-aef3-9b8bd04d1b33-config-volume\") pod \"collect-profiles-29551050-w4987\" (UID: \"58c8f82e-86b7-41aa-aef3-9b8bd04d1b33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-w4987" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.279499 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75vvl\" (UniqueName: \"kubernetes.io/projected/09b7269c-2186-4e45-bb2d-74ee25925050-kube-api-access-75vvl\") pod \"auto-csr-approver-29551050-w874x\" (UID: \"09b7269c-2186-4e45-bb2d-74ee25925050\") " pod="openshift-infra/auto-csr-approver-29551050-w874x" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.279533 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn8cz\" (UniqueName: \"kubernetes.io/projected/58c8f82e-86b7-41aa-aef3-9b8bd04d1b33-kube-api-access-cn8cz\") pod \"collect-profiles-29551050-w4987\" (UID: \"58c8f82e-86b7-41aa-aef3-9b8bd04d1b33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-w4987" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.381080 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58c8f82e-86b7-41aa-aef3-9b8bd04d1b33-config-volume\") pod \"collect-profiles-29551050-w4987\" (UID: \"58c8f82e-86b7-41aa-aef3-9b8bd04d1b33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-w4987" Mar 09 
13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.381193 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75vvl\" (UniqueName: \"kubernetes.io/projected/09b7269c-2186-4e45-bb2d-74ee25925050-kube-api-access-75vvl\") pod \"auto-csr-approver-29551050-w874x\" (UID: \"09b7269c-2186-4e45-bb2d-74ee25925050\") " pod="openshift-infra/auto-csr-approver-29551050-w874x" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.381289 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn8cz\" (UniqueName: \"kubernetes.io/projected/58c8f82e-86b7-41aa-aef3-9b8bd04d1b33-kube-api-access-cn8cz\") pod \"collect-profiles-29551050-w4987\" (UID: \"58c8f82e-86b7-41aa-aef3-9b8bd04d1b33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-w4987" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.381397 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/58c8f82e-86b7-41aa-aef3-9b8bd04d1b33-secret-volume\") pod \"collect-profiles-29551050-w4987\" (UID: \"58c8f82e-86b7-41aa-aef3-9b8bd04d1b33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-w4987" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.382580 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58c8f82e-86b7-41aa-aef3-9b8bd04d1b33-config-volume\") pod \"collect-profiles-29551050-w4987\" (UID: \"58c8f82e-86b7-41aa-aef3-9b8bd04d1b33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-w4987" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.391110 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/58c8f82e-86b7-41aa-aef3-9b8bd04d1b33-secret-volume\") pod \"collect-profiles-29551050-w4987\" (UID: 
\"58c8f82e-86b7-41aa-aef3-9b8bd04d1b33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-w4987" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.409806 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75vvl\" (UniqueName: \"kubernetes.io/projected/09b7269c-2186-4e45-bb2d-74ee25925050-kube-api-access-75vvl\") pod \"auto-csr-approver-29551050-w874x\" (UID: \"09b7269c-2186-4e45-bb2d-74ee25925050\") " pod="openshift-infra/auto-csr-approver-29551050-w874x" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.411069 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn8cz\" (UniqueName: \"kubernetes.io/projected/58c8f82e-86b7-41aa-aef3-9b8bd04d1b33-kube-api-access-cn8cz\") pod \"collect-profiles-29551050-w4987\" (UID: \"58c8f82e-86b7-41aa-aef3-9b8bd04d1b33\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-w4987" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.508077 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551050-w874x" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.521814 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-w4987" Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.928088 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551050-w4987"] Mar 09 13:30:00 crc kubenswrapper[4703]: W0309 13:30:00.943389 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58c8f82e_86b7_41aa_aef3_9b8bd04d1b33.slice/crio-309d87491ed17e72d9f8a0084c931766de76c1d454467bc438c42f7d62d83483 WatchSource:0}: Error finding container 309d87491ed17e72d9f8a0084c931766de76c1d454467bc438c42f7d62d83483: Status 404 returned error can't find the container with id 309d87491ed17e72d9f8a0084c931766de76c1d454467bc438c42f7d62d83483 Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.981225 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551050-w874x"] Mar 09 13:30:00 crc kubenswrapper[4703]: W0309 13:30:00.987533 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09b7269c_2186_4e45_bb2d_74ee25925050.slice/crio-04f0ca5a6c493dc41e65a0c920c920a85b67320887f26370b53d96718f257dc4 WatchSource:0}: Error finding container 04f0ca5a6c493dc41e65a0c920c920a85b67320887f26370b53d96718f257dc4: Status 404 returned error can't find the container with id 04f0ca5a6c493dc41e65a0c920c920a85b67320887f26370b53d96718f257dc4 Mar 09 13:30:00 crc kubenswrapper[4703]: I0309 13:30:00.990793 4703 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:30:01 crc kubenswrapper[4703]: I0309 13:30:01.212937 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-w4987" 
event={"ID":"58c8f82e-86b7-41aa-aef3-9b8bd04d1b33","Type":"ContainerStarted","Data":"42d00b778102249a4fdb67cec9cb6de0859b63ef4b12e6eb847add5e9ed09b7b"} Mar 09 13:30:01 crc kubenswrapper[4703]: I0309 13:30:01.213023 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-w4987" event={"ID":"58c8f82e-86b7-41aa-aef3-9b8bd04d1b33","Type":"ContainerStarted","Data":"309d87491ed17e72d9f8a0084c931766de76c1d454467bc438c42f7d62d83483"} Mar 09 13:30:01 crc kubenswrapper[4703]: I0309 13:30:01.214250 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551050-w874x" event={"ID":"09b7269c-2186-4e45-bb2d-74ee25925050","Type":"ContainerStarted","Data":"04f0ca5a6c493dc41e65a0c920c920a85b67320887f26370b53d96718f257dc4"} Mar 09 13:30:01 crc kubenswrapper[4703]: I0309 13:30:01.231104 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-w4987" podStartSLOduration=1.231080687 podStartE2EDuration="1.231080687s" podCreationTimestamp="2026-03-09 13:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:30:01.224886516 +0000 UTC m=+597.192302222" watchObservedRunningTime="2026-03-09 13:30:01.231080687 +0000 UTC m=+597.198496413" Mar 09 13:30:02 crc kubenswrapper[4703]: I0309 13:30:02.222013 4703 generic.go:334] "Generic (PLEG): container finished" podID="58c8f82e-86b7-41aa-aef3-9b8bd04d1b33" containerID="42d00b778102249a4fdb67cec9cb6de0859b63ef4b12e6eb847add5e9ed09b7b" exitCode=0 Mar 09 13:30:02 crc kubenswrapper[4703]: I0309 13:30:02.222072 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-w4987" 
event={"ID":"58c8f82e-86b7-41aa-aef3-9b8bd04d1b33","Type":"ContainerDied","Data":"42d00b778102249a4fdb67cec9cb6de0859b63ef4b12e6eb847add5e9ed09b7b"} Mar 09 13:30:03 crc kubenswrapper[4703]: I0309 13:30:03.230285 4703 generic.go:334] "Generic (PLEG): container finished" podID="09b7269c-2186-4e45-bb2d-74ee25925050" containerID="036724e1736742cdb81670ed160106d0c3f8be2570bc6b51cdbed1ae77f91a19" exitCode=0 Mar 09 13:30:03 crc kubenswrapper[4703]: I0309 13:30:03.230552 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551050-w874x" event={"ID":"09b7269c-2186-4e45-bb2d-74ee25925050","Type":"ContainerDied","Data":"036724e1736742cdb81670ed160106d0c3f8be2570bc6b51cdbed1ae77f91a19"} Mar 09 13:30:03 crc kubenswrapper[4703]: I0309 13:30:03.522165 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-w4987" Mar 09 13:30:03 crc kubenswrapper[4703]: I0309 13:30:03.626214 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58c8f82e-86b7-41aa-aef3-9b8bd04d1b33-config-volume\") pod \"58c8f82e-86b7-41aa-aef3-9b8bd04d1b33\" (UID: \"58c8f82e-86b7-41aa-aef3-9b8bd04d1b33\") " Mar 09 13:30:03 crc kubenswrapper[4703]: I0309 13:30:03.626374 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/58c8f82e-86b7-41aa-aef3-9b8bd04d1b33-secret-volume\") pod \"58c8f82e-86b7-41aa-aef3-9b8bd04d1b33\" (UID: \"58c8f82e-86b7-41aa-aef3-9b8bd04d1b33\") " Mar 09 13:30:03 crc kubenswrapper[4703]: I0309 13:30:03.626500 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn8cz\" (UniqueName: \"kubernetes.io/projected/58c8f82e-86b7-41aa-aef3-9b8bd04d1b33-kube-api-access-cn8cz\") pod \"58c8f82e-86b7-41aa-aef3-9b8bd04d1b33\" (UID: 
\"58c8f82e-86b7-41aa-aef3-9b8bd04d1b33\") " Mar 09 13:30:03 crc kubenswrapper[4703]: I0309 13:30:03.627143 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58c8f82e-86b7-41aa-aef3-9b8bd04d1b33-config-volume" (OuterVolumeSpecName: "config-volume") pod "58c8f82e-86b7-41aa-aef3-9b8bd04d1b33" (UID: "58c8f82e-86b7-41aa-aef3-9b8bd04d1b33"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:30:03 crc kubenswrapper[4703]: I0309 13:30:03.633539 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58c8f82e-86b7-41aa-aef3-9b8bd04d1b33-kube-api-access-cn8cz" (OuterVolumeSpecName: "kube-api-access-cn8cz") pod "58c8f82e-86b7-41aa-aef3-9b8bd04d1b33" (UID: "58c8f82e-86b7-41aa-aef3-9b8bd04d1b33"). InnerVolumeSpecName "kube-api-access-cn8cz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:30:03 crc kubenswrapper[4703]: I0309 13:30:03.634013 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58c8f82e-86b7-41aa-aef3-9b8bd04d1b33-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "58c8f82e-86b7-41aa-aef3-9b8bd04d1b33" (UID: "58c8f82e-86b7-41aa-aef3-9b8bd04d1b33"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:30:03 crc kubenswrapper[4703]: I0309 13:30:03.727912 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cn8cz\" (UniqueName: \"kubernetes.io/projected/58c8f82e-86b7-41aa-aef3-9b8bd04d1b33-kube-api-access-cn8cz\") on node \"crc\" DevicePath \"\"" Mar 09 13:30:03 crc kubenswrapper[4703]: I0309 13:30:03.727945 4703 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58c8f82e-86b7-41aa-aef3-9b8bd04d1b33-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:30:03 crc kubenswrapper[4703]: I0309 13:30:03.727958 4703 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/58c8f82e-86b7-41aa-aef3-9b8bd04d1b33-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:30:04 crc kubenswrapper[4703]: I0309 13:30:04.240945 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-w4987" Mar 09 13:30:04 crc kubenswrapper[4703]: I0309 13:30:04.240905 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551050-w4987" event={"ID":"58c8f82e-86b7-41aa-aef3-9b8bd04d1b33","Type":"ContainerDied","Data":"309d87491ed17e72d9f8a0084c931766de76c1d454467bc438c42f7d62d83483"} Mar 09 13:30:04 crc kubenswrapper[4703]: I0309 13:30:04.241044 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="309d87491ed17e72d9f8a0084c931766de76c1d454467bc438c42f7d62d83483" Mar 09 13:30:04 crc kubenswrapper[4703]: I0309 13:30:04.531509 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551050-w874x" Mar 09 13:30:04 crc kubenswrapper[4703]: I0309 13:30:04.639214 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75vvl\" (UniqueName: \"kubernetes.io/projected/09b7269c-2186-4e45-bb2d-74ee25925050-kube-api-access-75vvl\") pod \"09b7269c-2186-4e45-bb2d-74ee25925050\" (UID: \"09b7269c-2186-4e45-bb2d-74ee25925050\") " Mar 09 13:30:04 crc kubenswrapper[4703]: I0309 13:30:04.646949 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09b7269c-2186-4e45-bb2d-74ee25925050-kube-api-access-75vvl" (OuterVolumeSpecName: "kube-api-access-75vvl") pod "09b7269c-2186-4e45-bb2d-74ee25925050" (UID: "09b7269c-2186-4e45-bb2d-74ee25925050"). InnerVolumeSpecName "kube-api-access-75vvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:30:04 crc kubenswrapper[4703]: I0309 13:30:04.740715 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75vvl\" (UniqueName: \"kubernetes.io/projected/09b7269c-2186-4e45-bb2d-74ee25925050-kube-api-access-75vvl\") on node \"crc\" DevicePath \"\"" Mar 09 13:30:05 crc kubenswrapper[4703]: I0309 13:30:05.249368 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551050-w874x" event={"ID":"09b7269c-2186-4e45-bb2d-74ee25925050","Type":"ContainerDied","Data":"04f0ca5a6c493dc41e65a0c920c920a85b67320887f26370b53d96718f257dc4"} Mar 09 13:30:05 crc kubenswrapper[4703]: I0309 13:30:05.249431 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04f0ca5a6c493dc41e65a0c920c920a85b67320887f26370b53d96718f257dc4" Mar 09 13:30:05 crc kubenswrapper[4703]: I0309 13:30:05.249482 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551050-w874x" Mar 09 13:30:05 crc kubenswrapper[4703]: I0309 13:30:05.598260 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551044-rdp77"] Mar 09 13:30:05 crc kubenswrapper[4703]: I0309 13:30:05.601706 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551044-rdp77"] Mar 09 13:30:06 crc kubenswrapper[4703]: I0309 13:30:06.713693 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba5e15b8-80e8-4d9c-83eb-12dd004ca901" path="/var/lib/kubelet/pods/ba5e15b8-80e8-4d9c-83eb-12dd004ca901/volumes" Mar 09 13:30:09 crc kubenswrapper[4703]: I0309 13:30:09.500434 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:30:09 crc kubenswrapper[4703]: I0309 13:30:09.500560 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:30:39 crc kubenswrapper[4703]: I0309 13:30:39.499723 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:30:39 crc kubenswrapper[4703]: I0309 13:30:39.500431 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" 
podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:31:09 crc kubenswrapper[4703]: I0309 13:31:09.499978 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:31:09 crc kubenswrapper[4703]: I0309 13:31:09.500648 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:31:09 crc kubenswrapper[4703]: I0309 13:31:09.500718 4703 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" Mar 09 13:31:09 crc kubenswrapper[4703]: I0309 13:31:09.501622 4703 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6173caf8bb9b858c40be1c304f6c4689d83a93c7a582d8cb68f949346af67630"} pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:31:09 crc kubenswrapper[4703]: I0309 13:31:09.501723 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" containerID="cri-o://6173caf8bb9b858c40be1c304f6c4689d83a93c7a582d8cb68f949346af67630" gracePeriod=600 Mar 09 
13:31:09 crc kubenswrapper[4703]: I0309 13:31:09.756162 4703 generic.go:334] "Generic (PLEG): container finished" podID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerID="6173caf8bb9b858c40be1c304f6c4689d83a93c7a582d8cb68f949346af67630" exitCode=0 Mar 09 13:31:09 crc kubenswrapper[4703]: I0309 13:31:09.756286 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" event={"ID":"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29","Type":"ContainerDied","Data":"6173caf8bb9b858c40be1c304f6c4689d83a93c7a582d8cb68f949346af67630"} Mar 09 13:31:09 crc kubenswrapper[4703]: I0309 13:31:09.756583 4703 scope.go:117] "RemoveContainer" containerID="811feda9599d695e42cb52b203e2807fe7b8b341b8ecccd4904c7902531d2ff1" Mar 09 13:31:10 crc kubenswrapper[4703]: I0309 13:31:10.769072 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" event={"ID":"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29","Type":"ContainerStarted","Data":"9a30d8d9d6908c22df3252ca1b072edab287dd75aa2220dbdad578c0ef22ecae"} Mar 09 13:31:25 crc kubenswrapper[4703]: I0309 13:31:25.771347 4703 scope.go:117] "RemoveContainer" containerID="150e49a38897494fd79421901164b2fd14df8a16c15fa522cbe3c4c5208499c3" Mar 09 13:31:25 crc kubenswrapper[4703]: I0309 13:31:25.822997 4703 scope.go:117] "RemoveContainer" containerID="aba2307c3a2d038f289dc933c9a06d74f0d61651d04f08d6bdf587dc8d0dd961" Mar 09 13:32:00 crc kubenswrapper[4703]: I0309 13:32:00.142016 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551052-hc57s"] Mar 09 13:32:00 crc kubenswrapper[4703]: E0309 13:32:00.142799 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c8f82e-86b7-41aa-aef3-9b8bd04d1b33" containerName="collect-profiles" Mar 09 13:32:00 crc kubenswrapper[4703]: I0309 13:32:00.142814 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c8f82e-86b7-41aa-aef3-9b8bd04d1b33" 
containerName="collect-profiles" Mar 09 13:32:00 crc kubenswrapper[4703]: E0309 13:32:00.142833 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09b7269c-2186-4e45-bb2d-74ee25925050" containerName="oc" Mar 09 13:32:00 crc kubenswrapper[4703]: I0309 13:32:00.142841 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b7269c-2186-4e45-bb2d-74ee25925050" containerName="oc" Mar 09 13:32:00 crc kubenswrapper[4703]: I0309 13:32:00.142981 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="58c8f82e-86b7-41aa-aef3-9b8bd04d1b33" containerName="collect-profiles" Mar 09 13:32:00 crc kubenswrapper[4703]: I0309 13:32:00.142998 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="09b7269c-2186-4e45-bb2d-74ee25925050" containerName="oc" Mar 09 13:32:00 crc kubenswrapper[4703]: I0309 13:32:00.143450 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551052-hc57s" Mar 09 13:32:00 crc kubenswrapper[4703]: I0309 13:32:00.150176 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rkrjn" Mar 09 13:32:00 crc kubenswrapper[4703]: I0309 13:32:00.151048 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:32:00 crc kubenswrapper[4703]: I0309 13:32:00.151267 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:32:00 crc kubenswrapper[4703]: I0309 13:32:00.156286 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551052-hc57s"] Mar 09 13:32:00 crc kubenswrapper[4703]: I0309 13:32:00.204085 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xrpc\" (UniqueName: \"kubernetes.io/projected/ebaa375d-a2b4-4594-a69c-2da7cab4d396-kube-api-access-9xrpc\") pod 
\"auto-csr-approver-29551052-hc57s\" (UID: \"ebaa375d-a2b4-4594-a69c-2da7cab4d396\") " pod="openshift-infra/auto-csr-approver-29551052-hc57s" Mar 09 13:32:00 crc kubenswrapper[4703]: I0309 13:32:00.305552 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xrpc\" (UniqueName: \"kubernetes.io/projected/ebaa375d-a2b4-4594-a69c-2da7cab4d396-kube-api-access-9xrpc\") pod \"auto-csr-approver-29551052-hc57s\" (UID: \"ebaa375d-a2b4-4594-a69c-2da7cab4d396\") " pod="openshift-infra/auto-csr-approver-29551052-hc57s" Mar 09 13:32:00 crc kubenswrapper[4703]: I0309 13:32:00.340735 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xrpc\" (UniqueName: \"kubernetes.io/projected/ebaa375d-a2b4-4594-a69c-2da7cab4d396-kube-api-access-9xrpc\") pod \"auto-csr-approver-29551052-hc57s\" (UID: \"ebaa375d-a2b4-4594-a69c-2da7cab4d396\") " pod="openshift-infra/auto-csr-approver-29551052-hc57s" Mar 09 13:32:00 crc kubenswrapper[4703]: I0309 13:32:00.463719 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551052-hc57s" Mar 09 13:32:00 crc kubenswrapper[4703]: I0309 13:32:00.751117 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551052-hc57s"] Mar 09 13:32:01 crc kubenswrapper[4703]: I0309 13:32:01.130266 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551052-hc57s" event={"ID":"ebaa375d-a2b4-4594-a69c-2da7cab4d396","Type":"ContainerStarted","Data":"519cb64615ddaf85093dff9889990025f3d349911f1a7abedc3f5fbf66c2270a"} Mar 09 13:32:02 crc kubenswrapper[4703]: I0309 13:32:02.136431 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551052-hc57s" event={"ID":"ebaa375d-a2b4-4594-a69c-2da7cab4d396","Type":"ContainerStarted","Data":"a4e2e00e94bf933fd38a0cfe881ee74c2acf133d5bfca58901f5f4fc771e2461"} Mar 09 13:32:02 crc kubenswrapper[4703]: I0309 13:32:02.150707 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551052-hc57s" podStartSLOduration=1.060399967 podStartE2EDuration="2.150668409s" podCreationTimestamp="2026-03-09 13:32:00 +0000 UTC" firstStartedPulling="2026-03-09 13:32:00.75367279 +0000 UTC m=+716.721088476" lastFinishedPulling="2026-03-09 13:32:01.843941202 +0000 UTC m=+717.811356918" observedRunningTime="2026-03-09 13:32:02.148630102 +0000 UTC m=+718.116045798" watchObservedRunningTime="2026-03-09 13:32:02.150668409 +0000 UTC m=+718.118084095" Mar 09 13:32:03 crc kubenswrapper[4703]: I0309 13:32:03.143507 4703 generic.go:334] "Generic (PLEG): container finished" podID="ebaa375d-a2b4-4594-a69c-2da7cab4d396" containerID="a4e2e00e94bf933fd38a0cfe881ee74c2acf133d5bfca58901f5f4fc771e2461" exitCode=0 Mar 09 13:32:03 crc kubenswrapper[4703]: I0309 13:32:03.143675 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551052-hc57s" 
event={"ID":"ebaa375d-a2b4-4594-a69c-2da7cab4d396","Type":"ContainerDied","Data":"a4e2e00e94bf933fd38a0cfe881ee74c2acf133d5bfca58901f5f4fc771e2461"} Mar 09 13:32:04 crc kubenswrapper[4703]: I0309 13:32:04.447285 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551052-hc57s" Mar 09 13:32:04 crc kubenswrapper[4703]: I0309 13:32:04.466307 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xrpc\" (UniqueName: \"kubernetes.io/projected/ebaa375d-a2b4-4594-a69c-2da7cab4d396-kube-api-access-9xrpc\") pod \"ebaa375d-a2b4-4594-a69c-2da7cab4d396\" (UID: \"ebaa375d-a2b4-4594-a69c-2da7cab4d396\") " Mar 09 13:32:04 crc kubenswrapper[4703]: I0309 13:32:04.475408 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebaa375d-a2b4-4594-a69c-2da7cab4d396-kube-api-access-9xrpc" (OuterVolumeSpecName: "kube-api-access-9xrpc") pod "ebaa375d-a2b4-4594-a69c-2da7cab4d396" (UID: "ebaa375d-a2b4-4594-a69c-2da7cab4d396"). InnerVolumeSpecName "kube-api-access-9xrpc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:32:04 crc kubenswrapper[4703]: I0309 13:32:04.568025 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xrpc\" (UniqueName: \"kubernetes.io/projected/ebaa375d-a2b4-4594-a69c-2da7cab4d396-kube-api-access-9xrpc\") on node \"crc\" DevicePath \"\"" Mar 09 13:32:05 crc kubenswrapper[4703]: I0309 13:32:05.172472 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551052-hc57s" event={"ID":"ebaa375d-a2b4-4594-a69c-2da7cab4d396","Type":"ContainerDied","Data":"519cb64615ddaf85093dff9889990025f3d349911f1a7abedc3f5fbf66c2270a"} Mar 09 13:32:05 crc kubenswrapper[4703]: I0309 13:32:05.172546 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="519cb64615ddaf85093dff9889990025f3d349911f1a7abedc3f5fbf66c2270a" Mar 09 13:32:05 crc kubenswrapper[4703]: I0309 13:32:05.172723 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551052-hc57s" Mar 09 13:32:05 crc kubenswrapper[4703]: I0309 13:32:05.225456 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551046-r6sbp"] Mar 09 13:32:05 crc kubenswrapper[4703]: I0309 13:32:05.231793 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551046-r6sbp"] Mar 09 13:32:06 crc kubenswrapper[4703]: I0309 13:32:06.719014 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e289a811-bb2f-43bb-9543-5ae330b86a91" path="/var/lib/kubelet/pods/e289a811-bb2f-43bb-9543-5ae330b86a91/volumes" Mar 09 13:32:25 crc kubenswrapper[4703]: I0309 13:32:25.912003 4703 scope.go:117] "RemoveContainer" containerID="be6514ee01d072cd70cfc70248b6f6fd3d19c1b9adb5ab1428c0fae3347666f3" Mar 09 13:33:09 crc kubenswrapper[4703]: I0309 13:33:09.500317 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:33:09 crc kubenswrapper[4703]: I0309 13:33:09.501066 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:33:29 crc kubenswrapper[4703]: I0309 13:33:29.920036 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9khwq"] Mar 09 13:33:29 crc kubenswrapper[4703]: I0309 13:33:29.921301 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovn-controller" containerID="cri-o://9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14" gracePeriod=30 Mar 09 13:33:29 crc kubenswrapper[4703]: I0309 13:33:29.921421 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="northd" containerID="cri-o://01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9" gracePeriod=30 Mar 09 13:33:29 crc kubenswrapper[4703]: I0309 13:33:29.921489 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="kube-rbac-proxy-node" containerID="cri-o://75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c" gracePeriod=30 Mar 09 13:33:29 crc kubenswrapper[4703]: I0309 13:33:29.921563 4703 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovn-acl-logging" containerID="cri-o://b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b" gracePeriod=30 Mar 09 13:33:29 crc kubenswrapper[4703]: I0309 13:33:29.921468 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74" gracePeriod=30 Mar 09 13:33:29 crc kubenswrapper[4703]: I0309 13:33:29.921746 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="nbdb" containerID="cri-o://19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd" gracePeriod=30 Mar 09 13:33:29 crc kubenswrapper[4703]: I0309 13:33:29.921781 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="sbdb" containerID="cri-o://6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255" gracePeriod=30 Mar 09 13:33:29 crc kubenswrapper[4703]: I0309 13:33:29.984508 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovnkube-controller" containerID="cri-o://0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78" gracePeriod=30 Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.275046 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9khwq_650f98b2-73a9-4c73-b0cf-70d3bdd61edd/ovnkube-controller/3.log" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 
13:33:30.277565 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9khwq_650f98b2-73a9-4c73-b0cf-70d3bdd61edd/ovn-acl-logging/0.log" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.278082 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9khwq_650f98b2-73a9-4c73-b0cf-70d3bdd61edd/ovn-controller/0.log" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.278512 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.346658 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v7q7n"] Mar 09 13:33:30 crc kubenswrapper[4703]: E0309 13:33:30.346903 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.346919 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 13:33:30 crc kubenswrapper[4703]: E0309 13:33:30.346932 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovn-acl-logging" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.346941 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovn-acl-logging" Mar 09 13:33:30 crc kubenswrapper[4703]: E0309 13:33:30.346949 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovnkube-controller" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.346958 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovnkube-controller" Mar 
09 13:33:30 crc kubenswrapper[4703]: E0309 13:33:30.346970 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebaa375d-a2b4-4594-a69c-2da7cab4d396" containerName="oc" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.346977 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebaa375d-a2b4-4594-a69c-2da7cab4d396" containerName="oc" Mar 09 13:33:30 crc kubenswrapper[4703]: E0309 13:33:30.346990 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="nbdb" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.346997 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="nbdb" Mar 09 13:33:30 crc kubenswrapper[4703]: E0309 13:33:30.347009 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovnkube-controller" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.347016 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovnkube-controller" Mar 09 13:33:30 crc kubenswrapper[4703]: E0309 13:33:30.347028 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="kubecfg-setup" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.347035 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="kubecfg-setup" Mar 09 13:33:30 crc kubenswrapper[4703]: E0309 13:33:30.347049 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovn-controller" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.347057 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovn-controller" Mar 09 13:33:30 crc kubenswrapper[4703]: E0309 13:33:30.347068 
4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="sbdb" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.347075 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="sbdb" Mar 09 13:33:30 crc kubenswrapper[4703]: E0309 13:33:30.347087 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovnkube-controller" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.347094 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovnkube-controller" Mar 09 13:33:30 crc kubenswrapper[4703]: E0309 13:33:30.347103 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="kube-rbac-proxy-node" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.347110 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="kube-rbac-proxy-node" Mar 09 13:33:30 crc kubenswrapper[4703]: E0309 13:33:30.347118 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="northd" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.347126 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="northd" Mar 09 13:33:30 crc kubenswrapper[4703]: E0309 13:33:30.347134 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovnkube-controller" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.347142 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovnkube-controller" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.347246 4703 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.347259 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovnkube-controller" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.347270 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovnkube-controller" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.347279 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="kube-rbac-proxy-node" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.347289 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovn-acl-logging" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.347298 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="northd" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.347308 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="sbdb" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.347321 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovnkube-controller" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.347329 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovn-controller" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.347339 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovnkube-controller" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 
13:33:30.347347 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebaa375d-a2b4-4594-a69c-2da7cab4d396" containerName="oc" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.347356 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="nbdb" Mar 09 13:33:30 crc kubenswrapper[4703]: E0309 13:33:30.347467 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovnkube-controller" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.347477 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovnkube-controller" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.347607 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerName="ovnkube-controller" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.349649 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.429017 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-cni-bin\") pod \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.429124 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "650f98b2-73a9-4c73-b0cf-70d3bdd61edd" (UID: "650f98b2-73a9-4c73-b0cf-70d3bdd61edd"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.429144 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-env-overrides\") pod \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.429228 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-log-socket\") pod \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.429278 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-node-log\") pod \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.429350 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-log-socket" (OuterVolumeSpecName: "log-socket") pod "650f98b2-73a9-4c73-b0cf-70d3bdd61edd" (UID: "650f98b2-73a9-4c73-b0cf-70d3bdd61edd"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.429419 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-ovnkube-config\") pod \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.429432 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-node-log" (OuterVolumeSpecName: "node-log") pod "650f98b2-73a9-4c73-b0cf-70d3bdd61edd" (UID: "650f98b2-73a9-4c73-b0cf-70d3bdd61edd"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.429910 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "650f98b2-73a9-4c73-b0cf-70d3bdd61edd" (UID: "650f98b2-73a9-4c73-b0cf-70d3bdd61edd"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.430130 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "650f98b2-73a9-4c73-b0cf-70d3bdd61edd" (UID: "650f98b2-73a9-4c73-b0cf-70d3bdd61edd"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.430273 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-run-openvswitch\") pod \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.430369 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "650f98b2-73a9-4c73-b0cf-70d3bdd61edd" (UID: "650f98b2-73a9-4c73-b0cf-70d3bdd61edd"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.430461 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-var-lib-openvswitch\") pod \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.430602 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "650f98b2-73a9-4c73-b0cf-70d3bdd61edd" (UID: "650f98b2-73a9-4c73-b0cf-70d3bdd61edd"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.430681 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-ovn-node-metrics-cert\") pod \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.430931 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "650f98b2-73a9-4c73-b0cf-70d3bdd61edd" (UID: "650f98b2-73a9-4c73-b0cf-70d3bdd61edd"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.432157 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-cni-netd\") pod \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.432222 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.432275 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-run-netns\") pod \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " Mar 09 13:33:30 crc 
kubenswrapper[4703]: I0309 13:33:30.432334 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-systemd-units\") pod \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.432395 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-run-ovn-kubernetes\") pod \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.432452 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9psl\" (UniqueName: \"kubernetes.io/projected/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-kube-api-access-x9psl\") pod \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.432399 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "650f98b2-73a9-4c73-b0cf-70d3bdd61edd" (UID: "650f98b2-73a9-4c73-b0cf-70d3bdd61edd"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.432508 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-run-systemd\") pod \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.432392 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "650f98b2-73a9-4c73-b0cf-70d3bdd61edd" (UID: "650f98b2-73a9-4c73-b0cf-70d3bdd61edd"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.432562 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-run-ovn\") pod \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.432622 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-ovnkube-script-lib\") pod \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.432426 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "650f98b2-73a9-4c73-b0cf-70d3bdd61edd" (UID: "650f98b2-73a9-4c73-b0cf-70d3bdd61edd"). 
InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.432436 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "650f98b2-73a9-4c73-b0cf-70d3bdd61edd" (UID: "650f98b2-73a9-4c73-b0cf-70d3bdd61edd"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.432673 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-kubelet\") pod \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.432722 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "650f98b2-73a9-4c73-b0cf-70d3bdd61edd" (UID: "650f98b2-73a9-4c73-b0cf-70d3bdd61edd"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.432676 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "650f98b2-73a9-4c73-b0cf-70d3bdd61edd" (UID: "650f98b2-73a9-4c73-b0cf-70d3bdd61edd"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.432754 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-slash\") pod \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.432793 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-slash" (OuterVolumeSpecName: "host-slash") pod "650f98b2-73a9-4c73-b0cf-70d3bdd61edd" (UID: "650f98b2-73a9-4c73-b0cf-70d3bdd61edd"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.432943 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-etc-openvswitch\") pod \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\" (UID: \"650f98b2-73a9-4c73-b0cf-70d3bdd61edd\") " Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.433042 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "650f98b2-73a9-4c73-b0cf-70d3bdd61edd" (UID: "650f98b2-73a9-4c73-b0cf-70d3bdd61edd"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.433102 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "650f98b2-73a9-4c73-b0cf-70d3bdd61edd" (UID: "650f98b2-73a9-4c73-b0cf-70d3bdd61edd"). 
InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.433542 4703 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.433595 4703 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-log-socket\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.433623 4703 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-node-log\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.433648 4703 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.433673 4703 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.433697 4703 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.433721 4703 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:30 crc kubenswrapper[4703]: 
I0309 13:33:30.433749 4703 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.433778 4703 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.433802 4703 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.433827 4703 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.433885 4703 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.433912 4703 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.433937 4703 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.433962 4703 reconciler_common.go:293] "Volume 
detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-slash\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.433987 4703 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.434012 4703 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.437922 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-kube-api-access-x9psl" (OuterVolumeSpecName: "kube-api-access-x9psl") pod "650f98b2-73a9-4c73-b0cf-70d3bdd61edd" (UID: "650f98b2-73a9-4c73-b0cf-70d3bdd61edd"). InnerVolumeSpecName "kube-api-access-x9psl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.438467 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "650f98b2-73a9-4c73-b0cf-70d3bdd61edd" (UID: "650f98b2-73a9-4c73-b0cf-70d3bdd61edd"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.455674 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "650f98b2-73a9-4c73-b0cf-70d3bdd61edd" (UID: "650f98b2-73a9-4c73-b0cf-70d3bdd61edd"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.535616 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-run-openvswitch\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.535700 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvrg9\" (UniqueName: \"kubernetes.io/projected/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-kube-api-access-wvrg9\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.535770 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-ovn-node-metrics-cert\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.535860 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-host-run-ovn-kubernetes\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.535943 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-ovnkube-config\") pod 
\"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.536002 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-run-ovn\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.536052 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-host-cni-netd\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.536093 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-systemd-units\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.536122 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-node-log\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.536195 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-log-socket\") pod \"ovnkube-node-v7q7n\" 
(UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.536236 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-run-systemd\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.536269 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-host-cni-bin\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.536301 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.536339 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-etc-openvswitch\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.536374 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-host-slash\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.536407 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-env-overrides\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.536452 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-host-run-netns\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.536490 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-var-lib-openvswitch\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.536517 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-ovnkube-script-lib\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.536555 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-host-kubelet\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.536616 4703 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.536637 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9psl\" (UniqueName: \"kubernetes.io/projected/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-kube-api-access-x9psl\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.536657 4703 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/650f98b2-73a9-4c73-b0cf-70d3bdd61edd-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.637954 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-host-run-netns\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638068 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-var-lib-openvswitch\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638101 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-ovnkube-script-lib\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638136 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-host-kubelet\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638178 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-run-openvswitch\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638208 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvrg9\" (UniqueName: \"kubernetes.io/projected/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-kube-api-access-wvrg9\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638204 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-host-run-netns\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638229 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-var-lib-openvswitch\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638284 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-host-kubelet\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638234 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-ovn-node-metrics-cert\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638366 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-run-openvswitch\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638433 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-host-run-ovn-kubernetes\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638486 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-ovnkube-config\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638593 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-host-run-ovn-kubernetes\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638608 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-run-ovn\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638708 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-host-cni-netd\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638784 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-systemd-units\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638824 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-run-ovn\") pod \"ovnkube-node-v7q7n\" 
(UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638853 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-node-log\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638889 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-host-cni-netd\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638939 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-log-socket\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638982 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-node-log\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638947 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-systemd-units\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc 
kubenswrapper[4703]: I0309 13:33:30.639006 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-run-systemd\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.639044 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-run-systemd\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.639055 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-host-cni-bin\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.638988 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-log-socket\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.639101 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.639123 4703 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-host-cni-bin\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.639164 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.639173 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-etc-openvswitch\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.639243 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-etc-openvswitch\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.639267 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-host-slash\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.639315 4703 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-host-slash\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.639318 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-env-overrides\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.639672 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-ovnkube-script-lib\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.640033 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-ovnkube-config\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.640224 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-env-overrides\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.644427 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-ovn-node-metrics-cert\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.672944 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvrg9\" (UniqueName: \"kubernetes.io/projected/7d4d2124-dfc3-4e4c-be3e-c523d66ed683-kube-api-access-wvrg9\") pod \"ovnkube-node-v7q7n\" (UID: \"7d4d2124-dfc3-4e4c-be3e-c523d66ed683\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.780263 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n9x5k_d59f2278-9dbc-48bb-8d56-fa9da4183118/kube-multus/2.log" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.781086 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n9x5k_d59f2278-9dbc-48bb-8d56-fa9da4183118/kube-multus/1.log" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.781165 4703 generic.go:334] "Generic (PLEG): container finished" podID="d59f2278-9dbc-48bb-8d56-fa9da4183118" containerID="668964c01b4d2908da9803589c5ad4a6e403b5be0fd29ed46afa2fba2d4dd26c" exitCode=2 Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.781264 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n9x5k" event={"ID":"d59f2278-9dbc-48bb-8d56-fa9da4183118","Type":"ContainerDied","Data":"668964c01b4d2908da9803589c5ad4a6e403b5be0fd29ed46afa2fba2d4dd26c"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.781316 4703 scope.go:117] "RemoveContainer" containerID="d0ebf3127b5bf77a8c7241f6c6ee6e7507fcdf3cb9e424870b6c331bc5a40b3f" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.782125 4703 scope.go:117] "RemoveContainer" containerID="668964c01b4d2908da9803589c5ad4a6e403b5be0fd29ed46afa2fba2d4dd26c" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 
13:33:30.785215 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9khwq_650f98b2-73a9-4c73-b0cf-70d3bdd61edd/ovnkube-controller/3.log" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.790375 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9khwq_650f98b2-73a9-4c73-b0cf-70d3bdd61edd/ovn-acl-logging/0.log" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.791511 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9khwq_650f98b2-73a9-4c73-b0cf-70d3bdd61edd/ovn-controller/0.log" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.793460 4703 generic.go:334] "Generic (PLEG): container finished" podID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerID="0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78" exitCode=0 Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.793494 4703 generic.go:334] "Generic (PLEG): container finished" podID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerID="6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255" exitCode=0 Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.793512 4703 generic.go:334] "Generic (PLEG): container finished" podID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerID="19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd" exitCode=0 Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.793526 4703 generic.go:334] "Generic (PLEG): container finished" podID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerID="01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9" exitCode=0 Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.793539 4703 generic.go:334] "Generic (PLEG): container finished" podID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerID="3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74" exitCode=0 Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.793550 4703 
generic.go:334] "Generic (PLEG): container finished" podID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerID="75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c" exitCode=0 Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.793560 4703 generic.go:334] "Generic (PLEG): container finished" podID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerID="b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b" exitCode=143 Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.793572 4703 generic.go:334] "Generic (PLEG): container finished" podID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" containerID="9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14" exitCode=143 Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.793601 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerDied","Data":"0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.793683 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerDied","Data":"6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.793722 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerDied","Data":"19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.793751 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerDied","Data":"01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9"} Mar 09 13:33:30 crc 
kubenswrapper[4703]: I0309 13:33:30.793778 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerDied","Data":"3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.793805 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerDied","Data":"75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.793833 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.793896 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.793915 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.793929 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.793946 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.793960 4703 pod_container_deletor.go:114] "Failed to issue the request to remove 
container" containerID={"Type":"cri-o","ID":"3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.793975 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.793989 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794003 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794019 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794042 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerDied","Data":"b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794071 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794090 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794104 4703 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794185 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794353 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794377 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794445 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794458 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794468 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794480 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794498 4703 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerDied","Data":"9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794674 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794697 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794709 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794719 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794729 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794754 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794766 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c"} Mar 09 
13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794776 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794786 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794797 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794820 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" event={"ID":"650f98b2-73a9-4c73-b0cf-70d3bdd61edd","Type":"ContainerDied","Data":"b3d382ee9222a9fc707c99ad08b0d8e03b192284f998f3c8b8d69be6b1142445"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794883 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794903 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794918 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794931 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794943 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794953 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794963 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794974 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794984 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.794995 4703 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6"} Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.797036 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9khwq" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.816908 4703 scope.go:117] "RemoveContainer" containerID="0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.836632 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9khwq"] Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.841333 4703 scope.go:117] "RemoveContainer" containerID="eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.845154 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9khwq"] Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.868239 4703 scope.go:117] "RemoveContainer" containerID="6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.898273 4703 scope.go:117] "RemoveContainer" containerID="19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.917012 4703 scope.go:117] "RemoveContainer" containerID="01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.940118 4703 scope.go:117] "RemoveContainer" containerID="3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.959683 4703 scope.go:117] "RemoveContainer" containerID="75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.964321 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.978462 4703 scope.go:117] "RemoveContainer" containerID="b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b" Mar 09 13:33:30 crc kubenswrapper[4703]: I0309 13:33:30.993197 4703 scope.go:117] "RemoveContainer" containerID="9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.024406 4703 scope.go:117] "RemoveContainer" containerID="1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.046462 4703 scope.go:117] "RemoveContainer" containerID="0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78" Mar 09 13:33:31 crc kubenswrapper[4703]: E0309 13:33:31.046903 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78\": container with ID starting with 0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78 not found: ID does not exist" containerID="0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.046978 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78"} err="failed to get container status \"0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78\": rpc error: code = NotFound desc = could not find container \"0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78\": container with ID starting with 0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78 not found: ID does not exist" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.047020 4703 scope.go:117] "RemoveContainer" 
containerID="eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7" Mar 09 13:33:31 crc kubenswrapper[4703]: E0309 13:33:31.047374 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7\": container with ID starting with eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7 not found: ID does not exist" containerID="eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.047425 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7"} err="failed to get container status \"eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7\": rpc error: code = NotFound desc = could not find container \"eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7\": container with ID starting with eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7 not found: ID does not exist" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.047462 4703 scope.go:117] "RemoveContainer" containerID="6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255" Mar 09 13:33:31 crc kubenswrapper[4703]: E0309 13:33:31.047959 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\": container with ID starting with 6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255 not found: ID does not exist" containerID="6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.047992 4703 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255"} err="failed to get container status \"6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\": rpc error: code = NotFound desc = could not find container \"6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\": container with ID starting with 6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255 not found: ID does not exist" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.048008 4703 scope.go:117] "RemoveContainer" containerID="19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd" Mar 09 13:33:31 crc kubenswrapper[4703]: E0309 13:33:31.048730 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\": container with ID starting with 19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd not found: ID does not exist" containerID="19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.048789 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd"} err="failed to get container status \"19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\": rpc error: code = NotFound desc = could not find container \"19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\": container with ID starting with 19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd not found: ID does not exist" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.048818 4703 scope.go:117] "RemoveContainer" containerID="01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9" Mar 09 13:33:31 crc kubenswrapper[4703]: E0309 13:33:31.049464 4703 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\": container with ID starting with 01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9 not found: ID does not exist" containerID="01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.049512 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9"} err="failed to get container status \"01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\": rpc error: code = NotFound desc = could not find container \"01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\": container with ID starting with 01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9 not found: ID does not exist" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.049539 4703 scope.go:117] "RemoveContainer" containerID="3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74" Mar 09 13:33:31 crc kubenswrapper[4703]: E0309 13:33:31.049984 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\": container with ID starting with 3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74 not found: ID does not exist" containerID="3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.050044 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74"} err="failed to get container status \"3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\": rpc error: code = NotFound desc = could not find container 
\"3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\": container with ID starting with 3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74 not found: ID does not exist" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.050082 4703 scope.go:117] "RemoveContainer" containerID="75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c" Mar 09 13:33:31 crc kubenswrapper[4703]: E0309 13:33:31.050467 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\": container with ID starting with 75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c not found: ID does not exist" containerID="75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.050523 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c"} err="failed to get container status \"75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\": rpc error: code = NotFound desc = could not find container \"75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\": container with ID starting with 75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c not found: ID does not exist" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.050559 4703 scope.go:117] "RemoveContainer" containerID="b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b" Mar 09 13:33:31 crc kubenswrapper[4703]: E0309 13:33:31.051286 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\": container with ID starting with b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b not found: ID does not exist" 
containerID="b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.051346 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b"} err="failed to get container status \"b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\": rpc error: code = NotFound desc = could not find container \"b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\": container with ID starting with b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b not found: ID does not exist" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.051511 4703 scope.go:117] "RemoveContainer" containerID="9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14" Mar 09 13:33:31 crc kubenswrapper[4703]: E0309 13:33:31.051987 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\": container with ID starting with 9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14 not found: ID does not exist" containerID="9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.052055 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14"} err="failed to get container status \"9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\": rpc error: code = NotFound desc = could not find container \"9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\": container with ID starting with 9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14 not found: ID does not exist" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.052095 4703 scope.go:117] 
"RemoveContainer" containerID="1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6" Mar 09 13:33:31 crc kubenswrapper[4703]: E0309 13:33:31.052538 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\": container with ID starting with 1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6 not found: ID does not exist" containerID="1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.052577 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6"} err="failed to get container status \"1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\": rpc error: code = NotFound desc = could not find container \"1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\": container with ID starting with 1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6 not found: ID does not exist" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.052606 4703 scope.go:117] "RemoveContainer" containerID="0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.053642 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78"} err="failed to get container status \"0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78\": rpc error: code = NotFound desc = could not find container \"0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78\": container with ID starting with 0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78 not found: ID does not exist" Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.053684 4703 
scope.go:117] "RemoveContainer" containerID="eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.054136 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7"} err="failed to get container status \"eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7\": rpc error: code = NotFound desc = could not find container \"eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7\": container with ID starting with eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7 not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.054181 4703 scope.go:117] "RemoveContainer" containerID="6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.054518 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255"} err="failed to get container status \"6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\": rpc error: code = NotFound desc = could not find container \"6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\": container with ID starting with 6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255 not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.054561 4703 scope.go:117] "RemoveContainer" containerID="19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.054961 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd"} err="failed to get container status \"19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\": rpc error: code = NotFound desc = could not find container \"19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\": container with ID starting with 19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.055004 4703 scope.go:117] "RemoveContainer" containerID="01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.055456 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9"} err="failed to get container status \"01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\": rpc error: code = NotFound desc = could not find container \"01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\": container with ID starting with 01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9 not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.055497 4703 scope.go:117] "RemoveContainer" containerID="3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.056102 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74"} err="failed to get container status \"3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\": rpc error: code = NotFound desc = could not find container \"3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\": container with ID starting with 3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74 not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.056141 4703 scope.go:117] "RemoveContainer" containerID="75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.056471 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c"} err="failed to get container status \"75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\": rpc error: code = NotFound desc = could not find container \"75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\": container with ID starting with 75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.056511 4703 scope.go:117] "RemoveContainer" containerID="b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.056965 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b"} err="failed to get container status \"b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\": rpc error: code = NotFound desc = could not find container \"b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\": container with ID starting with b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.057001 4703 scope.go:117] "RemoveContainer" containerID="9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.057393 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14"} err="failed to get container status \"9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\": rpc error: code = NotFound desc = could not find container \"9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\": container with ID starting with 9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14 not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.057432 4703 scope.go:117] "RemoveContainer" containerID="1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.057769 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6"} err="failed to get container status \"1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\": rpc error: code = NotFound desc = could not find container \"1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\": container with ID starting with 1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6 not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.057812 4703 scope.go:117] "RemoveContainer" containerID="0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.058694 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78"} err="failed to get container status \"0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78\": rpc error: code = NotFound desc = could not find container \"0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78\": container with ID starting with 0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78 not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.058728 4703 scope.go:117] "RemoveContainer" containerID="eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.059249 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7"} err="failed to get container status \"eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7\": rpc error: code = NotFound desc = could not find container \"eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7\": container with ID starting with eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7 not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.059293 4703 scope.go:117] "RemoveContainer" containerID="6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.059723 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255"} err="failed to get container status \"6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\": rpc error: code = NotFound desc = could not find container \"6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\": container with ID starting with 6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255 not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.059797 4703 scope.go:117] "RemoveContainer" containerID="19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.060332 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd"} err="failed to get container status \"19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\": rpc error: code = NotFound desc = could not find container \"19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\": container with ID starting with 19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.060375 4703 scope.go:117] "RemoveContainer" containerID="01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.060800 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9"} err="failed to get container status \"01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\": rpc error: code = NotFound desc = could not find container \"01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\": container with ID starting with 01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9 not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.060837 4703 scope.go:117] "RemoveContainer" containerID="3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.061558 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74"} err="failed to get container status \"3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\": rpc error: code = NotFound desc = could not find container \"3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\": container with ID starting with 3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74 not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.061605 4703 scope.go:117] "RemoveContainer" containerID="75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.062759 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c"} err="failed to get container status \"75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\": rpc error: code = NotFound desc = could not find container \"75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\": container with ID starting with 75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.062801 4703 scope.go:117] "RemoveContainer" containerID="b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.065723 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b"} err="failed to get container status \"b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\": rpc error: code = NotFound desc = could not find container \"b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\": container with ID starting with b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.065760 4703 scope.go:117] "RemoveContainer" containerID="9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.066919 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14"} err="failed to get container status \"9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\": rpc error: code = NotFound desc = could not find container \"9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\": container with ID starting with 9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14 not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.067027 4703 scope.go:117] "RemoveContainer" containerID="1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.067656 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6"} err="failed to get container status \"1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\": rpc error: code = NotFound desc = could not find container \"1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\": container with ID starting with 1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6 not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.067694 4703 scope.go:117] "RemoveContainer" containerID="0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.068100 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78"} err="failed to get container status \"0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78\": rpc error: code = NotFound desc = could not find container \"0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78\": container with ID starting with 0525900d9aabcb500d03734b7d2af4a9fbc9a3dfbf61f8505db89ad4f7c49f78 not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.068139 4703 scope.go:117] "RemoveContainer" containerID="eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.068522 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7"} err="failed to get container status \"eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7\": rpc error: code = NotFound desc = could not find container \"eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7\": container with ID starting with eca97170af7033f47fb9e015690ba7f6c1757f73f61af72ee0bbc07859deb0c7 not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.068561 4703 scope.go:117] "RemoveContainer" containerID="6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.068999 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255"} err="failed to get container status \"6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\": rpc error: code = NotFound desc = could not find container \"6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255\": container with ID starting with 6ae3c03f32be25ff301dbb82c78027263d090f24a0293c852722f238bda5b255 not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.069100 4703 scope.go:117] "RemoveContainer" containerID="19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.069693 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd"} err="failed to get container status \"19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\": rpc error: code = NotFound desc = could not find container \"19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd\": container with ID starting with 19da48531776c9fc7e18c401d041faaefa837aa5349dcc1594eef7673a9079cd not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.069733 4703 scope.go:117] "RemoveContainer" containerID="01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.070288 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9"} err="failed to get container status \"01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\": rpc error: code = NotFound desc = could not find container \"01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9\": container with ID starting with 01e042180d571f6ba7426028b58ebfe60125d4fb506b567a986dc9d71c7a3ea9 not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.070328 4703 scope.go:117] "RemoveContainer" containerID="3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.070715 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74"} err="failed to get container status \"3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\": rpc error: code = NotFound desc = could not find container \"3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74\": container with ID starting with 3dab6741fd92d17268b995f661480e602d97fbbecdbaf727da3e6ed07f0eec74 not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.070765 4703 scope.go:117] "RemoveContainer" containerID="75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.071255 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c"} err="failed to get container status \"75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\": rpc error: code = NotFound desc = could not find container \"75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c\": container with ID starting with 75a5e2927fe3342514f821bd609c7b6c660f45f3882eb40176e99f6de1431a0c not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.071295 4703 scope.go:117] "RemoveContainer" containerID="b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.071621 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b"} err="failed to get container status \"b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\": rpc error: code = NotFound desc = could not find container \"b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b\": container with ID starting with b400412f006a7dfd177cc742c6e564a62393f1762b59c491e4380eddaa97eb5b not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.071659 4703 scope.go:117] "RemoveContainer" containerID="9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.071987 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14"} err="failed to get container status \"9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\": rpc error: code = NotFound desc = could not find container \"9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14\": container with ID starting with 9549552a0c44026a68565502e103e97a93d78f8fd7c332f8828a1c4a7705cc14 not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.072024 4703 scope.go:117] "RemoveContainer" containerID="1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.072384 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6"} err="failed to get container status \"1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\": rpc error: code = NotFound desc = could not find container \"1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6\": container with ID starting with 1607920f4157e81450b01b202f7442ea9e7ece16bb091f94682a2dd86661a8e6 not found: ID does not exist"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.803937 4703 generic.go:334] "Generic (PLEG): container finished" podID="7d4d2124-dfc3-4e4c-be3e-c523d66ed683" containerID="39346462c9f45747e27c6825fc19aa1b489ea7a1977b50e21fb428948c0647fc" exitCode=0
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.803998 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" event={"ID":"7d4d2124-dfc3-4e4c-be3e-c523d66ed683","Type":"ContainerDied","Data":"39346462c9f45747e27c6825fc19aa1b489ea7a1977b50e21fb428948c0647fc"}
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.804427 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" event={"ID":"7d4d2124-dfc3-4e4c-be3e-c523d66ed683","Type":"ContainerStarted","Data":"c9a11a87c455782e38104e5f0ddb04cac42120a0000252a8f1cea3ace1ecb2d5"}
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.806677 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n9x5k_d59f2278-9dbc-48bb-8d56-fa9da4183118/kube-multus/2.log"
Mar 09 13:33:31 crc kubenswrapper[4703]: I0309 13:33:31.806728 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n9x5k" event={"ID":"d59f2278-9dbc-48bb-8d56-fa9da4183118","Type":"ContainerStarted","Data":"bcaf600bb2e6e9c74e2a3307b28c949c33fcdfea72f8acb1cd799b4365050096"}
Mar 09 13:33:32 crc kubenswrapper[4703]: I0309 13:33:32.715384 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="650f98b2-73a9-4c73-b0cf-70d3bdd61edd" path="/var/lib/kubelet/pods/650f98b2-73a9-4c73-b0cf-70d3bdd61edd/volumes"
Mar 09 13:33:32 crc kubenswrapper[4703]: I0309 13:33:32.822511 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" event={"ID":"7d4d2124-dfc3-4e4c-be3e-c523d66ed683","Type":"ContainerStarted","Data":"47a34dbcb9c2a566b55ae1acaa62e80c6fc074bf3da648f05eb71ddea1cc3653"}
Mar 09 13:33:32 crc kubenswrapper[4703]: I0309 13:33:32.822591 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" event={"ID":"7d4d2124-dfc3-4e4c-be3e-c523d66ed683","Type":"ContainerStarted","Data":"a6d0db9bf62d59f30eebc44fb19905ac0531bed720dc8920ee29e23bcad4ae33"}
Mar 09 13:33:32 crc kubenswrapper[4703]: I0309 13:33:32.822622 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" event={"ID":"7d4d2124-dfc3-4e4c-be3e-c523d66ed683","Type":"ContainerStarted","Data":"d86b90706fbbf315aca81af53a9d9d276193d168d50dc82b6ae831ba3ceade49"}
Mar 09 13:33:32 crc kubenswrapper[4703]: I0309 13:33:32.822654 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" event={"ID":"7d4d2124-dfc3-4e4c-be3e-c523d66ed683","Type":"ContainerStarted","Data":"00cf3702152feb73793ec0598698d21325fcc6ad9aeb2d7cc4f4e838dc2dd023"}
Mar 09 13:33:32 crc kubenswrapper[4703]: I0309 13:33:32.822678 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" event={"ID":"7d4d2124-dfc3-4e4c-be3e-c523d66ed683","Type":"ContainerStarted","Data":"a77cc7192eae40008e08bd8cbf4c37c72ce33804a492e9e00e0a7ec29558b395"}
Mar 09 13:33:32 crc kubenswrapper[4703]: I0309 13:33:32.822701 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" event={"ID":"7d4d2124-dfc3-4e4c-be3e-c523d66ed683","Type":"ContainerStarted","Data":"284e37e2e13d46ecfbb4bc0028124c4dffb7b9f83ee1cea3ab64040d109189f1"}
Mar 09 13:33:35 crc kubenswrapper[4703]: I0309 13:33:35.845020 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" event={"ID":"7d4d2124-dfc3-4e4c-be3e-c523d66ed683","Type":"ContainerStarted","Data":"1eecfb2c6c0f293f65c88bc7b9e020cf6ffa66176bb7a62594657b2f5ad35274"}
Mar 09 13:33:37 crc kubenswrapper[4703]: I0309 13:33:37.859602 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" event={"ID":"7d4d2124-dfc3-4e4c-be3e-c523d66ed683","Type":"ContainerStarted","Data":"ee679d9c59dd7393f677749d1226864b7606b39a37adbececd4adeb86d4d00c9"}
Mar 09 13:33:37 crc kubenswrapper[4703]: I0309 13:33:37.860168 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n"
Mar 09 13:33:37 crc kubenswrapper[4703]: I0309 13:33:37.860180 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n"
Mar 09 13:33:37 crc kubenswrapper[4703]: I0309 13:33:37.881551 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n"
Mar 09 13:33:37 crc kubenswrapper[4703]: I0309 13:33:37.901583 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" podStartSLOduration=7.901558814 podStartE2EDuration="7.901558814s" podCreationTimestamp="2026-03-09 13:33:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:33:37.898290544 +0000 UTC m=+813.865706230" watchObservedRunningTime="2026-03-09 13:33:37.901558814 +0000 UTC m=+813.868974520"
Mar 09 13:33:38 crc kubenswrapper[4703]: I0309 13:33:38.865942 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n"
Mar 09 13:33:38 crc kubenswrapper[4703]: I0309 13:33:38.891596 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n"
Mar 09 13:33:39 crc kubenswrapper[4703]: I0309 13:33:39.499930 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:33:39 crc kubenswrapper[4703]: I0309 13:33:39.500032 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:33:55 crc kubenswrapper[4703]: I0309 13:33:55.608358 4703 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 09 13:33:56 crc kubenswrapper[4703]: I0309 13:33:56.441349 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m"]
Mar 09 13:33:56 crc kubenswrapper[4703]: I0309 13:33:56.443265 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m"
Mar 09 13:33:56 crc kubenswrapper[4703]: I0309 13:33:56.446718 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 09 13:33:56 crc kubenswrapper[4703]: I0309 13:33:56.456223 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m"]
Mar 09 13:33:56 crc kubenswrapper[4703]: I0309 13:33:56.592913 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f48ca74-d2f2-4baf-a448-e980848ac419-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m\" (UID: \"2f48ca74-d2f2-4baf-a448-e980848ac419\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m"
Mar 09 13:33:56 crc kubenswrapper[4703]: I0309 13:33:56.593034 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f48ca74-d2f2-4baf-a448-e980848ac419-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m\" (UID: \"2f48ca74-d2f2-4baf-a448-e980848ac419\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m"
Mar 09 13:33:56 crc kubenswrapper[4703]: I0309 13:33:56.593086 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9lgd\" (UniqueName: \"kubernetes.io/projected/2f48ca74-d2f2-4baf-a448-e980848ac419-kube-api-access-s9lgd\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m\" (UID: \"2f48ca74-d2f2-4baf-a448-e980848ac419\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m"
Mar 09 13:33:56 crc kubenswrapper[4703]: I0309 13:33:56.694269 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f48ca74-d2f2-4baf-a448-e980848ac419-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m\" (UID: \"2f48ca74-d2f2-4baf-a448-e980848ac419\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m"
Mar 09 13:33:56 crc kubenswrapper[4703]: I0309 13:33:56.694361 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9lgd\" (UniqueName: \"kubernetes.io/projected/2f48ca74-d2f2-4baf-a448-e980848ac419-kube-api-access-s9lgd\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m\" (UID: \"2f48ca74-d2f2-4baf-a448-e980848ac419\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m"
Mar 09 13:33:56 crc kubenswrapper[4703]: I0309 13:33:56.694470 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f48ca74-d2f2-4baf-a448-e980848ac419-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m\" (UID: \"2f48ca74-d2f2-4baf-a448-e980848ac419\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m"
Mar 09 13:33:56 crc kubenswrapper[4703]: I0309 13:33:56.695198 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f48ca74-d2f2-4baf-a448-e980848ac419-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m\" (UID: \"2f48ca74-d2f2-4baf-a448-e980848ac419\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m"
Mar 09 13:33:56 crc kubenswrapper[4703]: I0309 13:33:56.695290 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f48ca74-d2f2-4baf-a448-e980848ac419-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m\" (UID: \"2f48ca74-d2f2-4baf-a448-e980848ac419\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m"
Mar 09 13:33:56 crc kubenswrapper[4703]: I0309 13:33:56.732259 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9lgd\" (UniqueName: \"kubernetes.io/projected/2f48ca74-d2f2-4baf-a448-e980848ac419-kube-api-access-s9lgd\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m\" (UID: \"2f48ca74-d2f2-4baf-a448-e980848ac419\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m"
Mar 09 13:33:56 crc kubenswrapper[4703]: I0309 13:33:56.781712 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m"
Mar 09 13:33:57 crc kubenswrapper[4703]: I0309 13:33:57.259833 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m"]
Mar 09 13:33:58 crc kubenswrapper[4703]: I0309 13:33:57.999978 4703 generic.go:334] "Generic (PLEG): container finished" podID="2f48ca74-d2f2-4baf-a448-e980848ac419" containerID="04512086bb09f7188e6951e90cbf229d84e0bd519915aaeb5373fd028d1b9f44" exitCode=0
Mar 09 13:33:58 crc kubenswrapper[4703]: I0309 13:33:58.000028 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m" event={"ID":"2f48ca74-d2f2-4baf-a448-e980848ac419","Type":"ContainerDied","Data":"04512086bb09f7188e6951e90cbf229d84e0bd519915aaeb5373fd028d1b9f44"}
Mar 09 13:33:58 crc kubenswrapper[4703]: I0309 13:33:58.000064 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m" event={"ID":"2f48ca74-d2f2-4baf-a448-e980848ac419","Type":"ContainerStarted","Data":"6fa44cbb92a1544d6d5740300a40fb224fe80947cb0c48a4f24620bf08ab459c"}
Mar 09 13:33:58 crc kubenswrapper[4703]: I0309 13:33:58.775014 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kx266"]
Mar 09 13:33:58 crc kubenswrapper[4703]: I0309 13:33:58.776586 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kx266"
Mar 09 13:33:58 crc kubenswrapper[4703]: I0309 13:33:58.788263 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kx266"]
Mar 09 13:33:58 crc kubenswrapper[4703]: I0309 13:33:58.924698 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4696f44c-9b19-4bb0-9232-c0fcdc101439-catalog-content\") pod \"redhat-operators-kx266\" (UID: \"4696f44c-9b19-4bb0-9232-c0fcdc101439\") " pod="openshift-marketplace/redhat-operators-kx266"
Mar 09 13:33:58 crc kubenswrapper[4703]: I0309 13:33:58.924768 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4696f44c-9b19-4bb0-9232-c0fcdc101439-utilities\") pod \"redhat-operators-kx266\" (UID: \"4696f44c-9b19-4bb0-9232-c0fcdc101439\") " pod="openshift-marketplace/redhat-operators-kx266"
Mar 09 13:33:58 crc kubenswrapper[4703]: I0309 13:33:58.924802 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntvn6\" (UniqueName: \"kubernetes.io/projected/4696f44c-9b19-4bb0-9232-c0fcdc101439-kube-api-access-ntvn6\") pod \"redhat-operators-kx266\" (UID: \"4696f44c-9b19-4bb0-9232-c0fcdc101439\") " pod="openshift-marketplace/redhat-operators-kx266"
Mar 09 13:33:59 crc kubenswrapper[4703]: I0309 13:33:59.026119 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4696f44c-9b19-4bb0-9232-c0fcdc101439-utilities\") pod \"redhat-operators-kx266\" (UID: \"4696f44c-9b19-4bb0-9232-c0fcdc101439\") " pod="openshift-marketplace/redhat-operators-kx266"
Mar 09 13:33:59 crc kubenswrapper[4703]: I0309 13:33:59.026722 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4696f44c-9b19-4bb0-9232-c0fcdc101439-catalog-content\") pod \"redhat-operators-kx266\" (UID: \"4696f44c-9b19-4bb0-9232-c0fcdc101439\") " pod="openshift-marketplace/redhat-operators-kx266"
Mar 09 13:33:59 crc kubenswrapper[4703]: I0309 13:33:59.026862 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntvn6\" (UniqueName: \"kubernetes.io/projected/4696f44c-9b19-4bb0-9232-c0fcdc101439-kube-api-access-ntvn6\") pod \"redhat-operators-kx266\" (UID: \"4696f44c-9b19-4bb0-9232-c0fcdc101439\") " pod="openshift-marketplace/redhat-operators-kx266"
Mar 09 13:33:59 crc kubenswrapper[4703]: I0309 13:33:59.027016 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4696f44c-9b19-4bb0-9232-c0fcdc101439-catalog-content\") pod \"redhat-operators-kx266\" (UID: \"4696f44c-9b19-4bb0-9232-c0fcdc101439\") " pod="openshift-marketplace/redhat-operators-kx266"
Mar 09 13:33:59 crc kubenswrapper[4703]: I0309 13:33:59.026621 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4696f44c-9b19-4bb0-9232-c0fcdc101439-utilities\") pod \"redhat-operators-kx266\" (UID: \"4696f44c-9b19-4bb0-9232-c0fcdc101439\") " pod="openshift-marketplace/redhat-operators-kx266"
Mar 09 13:33:59 crc kubenswrapper[4703]: I0309 13:33:59.050828 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntvn6\" (UniqueName: \"kubernetes.io/projected/4696f44c-9b19-4bb0-9232-c0fcdc101439-kube-api-access-ntvn6\") pod \"redhat-operators-kx266\" (UID: \"4696f44c-9b19-4bb0-9232-c0fcdc101439\") " pod="openshift-marketplace/redhat-operators-kx266"
Mar 09 13:33:59 crc kubenswrapper[4703]: I0309 13:33:59.105005 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kx266"
Mar 09 13:33:59 crc kubenswrapper[4703]: I0309 13:33:59.300313 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kx266"]
Mar 09 13:34:00 crc kubenswrapper[4703]: I0309 13:34:00.011837 4703 generic.go:334] "Generic (PLEG): container finished" podID="2f48ca74-d2f2-4baf-a448-e980848ac419" containerID="31f04041624cfb1c3dc6d8a4782b0d6b378be84be84d0d8bba349d2b1c049afc" exitCode=0
Mar 09 13:34:00 crc kubenswrapper[4703]: I0309 13:34:00.011903 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m" event={"ID":"2f48ca74-d2f2-4baf-a448-e980848ac419","Type":"ContainerDied","Data":"31f04041624cfb1c3dc6d8a4782b0d6b378be84be84d0d8bba349d2b1c049afc"}
Mar 09 13:34:00 crc kubenswrapper[4703]: I0309 13:34:00.014012 4703 generic.go:334] "Generic (PLEG): container finished" podID="4696f44c-9b19-4bb0-9232-c0fcdc101439" containerID="7f4c01113b7676da674d69ec5ee8f102599bccbd9bf35e97bee7a971d1ed3850" exitCode=0
Mar 09 13:34:00 crc kubenswrapper[4703]: I0309 13:34:00.014050 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kx266" event={"ID":"4696f44c-9b19-4bb0-9232-c0fcdc101439","Type":"ContainerDied","Data":"7f4c01113b7676da674d69ec5ee8f102599bccbd9bf35e97bee7a971d1ed3850"}
Mar 09 13:34:00 crc kubenswrapper[4703]: I0309 13:34:00.014074 4703 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kx266" event={"ID":"4696f44c-9b19-4bb0-9232-c0fcdc101439","Type":"ContainerStarted","Data":"262d34da6fce5e2075934ac9dcdf0fb010e139e5fd0496504353531c44e84a66"} Mar 09 13:34:00 crc kubenswrapper[4703]: I0309 13:34:00.130956 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551054-nt48w"] Mar 09 13:34:00 crc kubenswrapper[4703]: I0309 13:34:00.131691 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-nt48w" Mar 09 13:34:00 crc kubenswrapper[4703]: I0309 13:34:00.138307 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rkrjn" Mar 09 13:34:00 crc kubenswrapper[4703]: I0309 13:34:00.139272 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcr99\" (UniqueName: \"kubernetes.io/projected/bb09a646-e855-46a3-8091-333d81ef7c8f-kube-api-access-jcr99\") pod \"auto-csr-approver-29551054-nt48w\" (UID: \"bb09a646-e855-46a3-8091-333d81ef7c8f\") " pod="openshift-infra/auto-csr-approver-29551054-nt48w" Mar 09 13:34:00 crc kubenswrapper[4703]: I0309 13:34:00.139832 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:34:00 crc kubenswrapper[4703]: I0309 13:34:00.140002 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:34:00 crc kubenswrapper[4703]: I0309 13:34:00.145465 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551054-nt48w"] Mar 09 13:34:00 crc kubenswrapper[4703]: I0309 13:34:00.240160 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcr99\" (UniqueName: 
\"kubernetes.io/projected/bb09a646-e855-46a3-8091-333d81ef7c8f-kube-api-access-jcr99\") pod \"auto-csr-approver-29551054-nt48w\" (UID: \"bb09a646-e855-46a3-8091-333d81ef7c8f\") " pod="openshift-infra/auto-csr-approver-29551054-nt48w" Mar 09 13:34:00 crc kubenswrapper[4703]: I0309 13:34:00.268790 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcr99\" (UniqueName: \"kubernetes.io/projected/bb09a646-e855-46a3-8091-333d81ef7c8f-kube-api-access-jcr99\") pod \"auto-csr-approver-29551054-nt48w\" (UID: \"bb09a646-e855-46a3-8091-333d81ef7c8f\") " pod="openshift-infra/auto-csr-approver-29551054-nt48w" Mar 09 13:34:00 crc kubenswrapper[4703]: I0309 13:34:00.453943 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-nt48w" Mar 09 13:34:00 crc kubenswrapper[4703]: I0309 13:34:00.677394 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551054-nt48w"] Mar 09 13:34:00 crc kubenswrapper[4703]: W0309 13:34:00.681154 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb09a646_e855_46a3_8091_333d81ef7c8f.slice/crio-f8558ec5239ce47a64fc0d0a687667586924285fda0fa463a26297503e0fab39 WatchSource:0}: Error finding container f8558ec5239ce47a64fc0d0a687667586924285fda0fa463a26297503e0fab39: Status 404 returned error can't find the container with id f8558ec5239ce47a64fc0d0a687667586924285fda0fa463a26297503e0fab39 Mar 09 13:34:01 crc kubenswrapper[4703]: I0309 13:34:01.002542 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v7q7n" Mar 09 13:34:01 crc kubenswrapper[4703]: I0309 13:34:01.022527 4703 generic.go:334] "Generic (PLEG): container finished" podID="2f48ca74-d2f2-4baf-a448-e980848ac419" containerID="9816ef0b15369c467559279bcaf0e68fee3dbecaedc2c3e9465bfa6cfb242d3e" exitCode=0 Mar 
09 13:34:01 crc kubenswrapper[4703]: I0309 13:34:01.022631 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m" event={"ID":"2f48ca74-d2f2-4baf-a448-e980848ac419","Type":"ContainerDied","Data":"9816ef0b15369c467559279bcaf0e68fee3dbecaedc2c3e9465bfa6cfb242d3e"} Mar 09 13:34:01 crc kubenswrapper[4703]: I0309 13:34:01.023878 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551054-nt48w" event={"ID":"bb09a646-e855-46a3-8091-333d81ef7c8f","Type":"ContainerStarted","Data":"f8558ec5239ce47a64fc0d0a687667586924285fda0fa463a26297503e0fab39"} Mar 09 13:34:01 crc kubenswrapper[4703]: I0309 13:34:01.025999 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kx266" event={"ID":"4696f44c-9b19-4bb0-9232-c0fcdc101439","Type":"ContainerStarted","Data":"d82eb003ad96fb65ac4ba6c5daeab2f6421134ed8f57081e529223b29572cf76"} Mar 09 13:34:02 crc kubenswrapper[4703]: I0309 13:34:02.037718 4703 generic.go:334] "Generic (PLEG): container finished" podID="4696f44c-9b19-4bb0-9232-c0fcdc101439" containerID="d82eb003ad96fb65ac4ba6c5daeab2f6421134ed8f57081e529223b29572cf76" exitCode=0 Mar 09 13:34:02 crc kubenswrapper[4703]: I0309 13:34:02.037834 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kx266" event={"ID":"4696f44c-9b19-4bb0-9232-c0fcdc101439","Type":"ContainerDied","Data":"d82eb003ad96fb65ac4ba6c5daeab2f6421134ed8f57081e529223b29572cf76"} Mar 09 13:34:02 crc kubenswrapper[4703]: I0309 13:34:02.041230 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551054-nt48w" event={"ID":"bb09a646-e855-46a3-8091-333d81ef7c8f","Type":"ContainerStarted","Data":"0f4e9edfbc9aad297ec87e9b5f4c229b40c2f094d58b5abe2a633c3836a42535"} Mar 09 13:34:02 crc kubenswrapper[4703]: I0309 13:34:02.096369 4703 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551054-nt48w" podStartSLOduration=1.109962235 podStartE2EDuration="2.096347317s" podCreationTimestamp="2026-03-09 13:34:00 +0000 UTC" firstStartedPulling="2026-03-09 13:34:00.696681936 +0000 UTC m=+836.664097632" lastFinishedPulling="2026-03-09 13:34:01.683066998 +0000 UTC m=+837.650482714" observedRunningTime="2026-03-09 13:34:02.08948314 +0000 UTC m=+838.056898836" watchObservedRunningTime="2026-03-09 13:34:02.096347317 +0000 UTC m=+838.063763023" Mar 09 13:34:02 crc kubenswrapper[4703]: I0309 13:34:02.338018 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m" Mar 09 13:34:02 crc kubenswrapper[4703]: I0309 13:34:02.370836 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9lgd\" (UniqueName: \"kubernetes.io/projected/2f48ca74-d2f2-4baf-a448-e980848ac419-kube-api-access-s9lgd\") pod \"2f48ca74-d2f2-4baf-a448-e980848ac419\" (UID: \"2f48ca74-d2f2-4baf-a448-e980848ac419\") " Mar 09 13:34:02 crc kubenswrapper[4703]: I0309 13:34:02.370926 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f48ca74-d2f2-4baf-a448-e980848ac419-util\") pod \"2f48ca74-d2f2-4baf-a448-e980848ac419\" (UID: \"2f48ca74-d2f2-4baf-a448-e980848ac419\") " Mar 09 13:34:02 crc kubenswrapper[4703]: I0309 13:34:02.370999 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f48ca74-d2f2-4baf-a448-e980848ac419-bundle\") pod \"2f48ca74-d2f2-4baf-a448-e980848ac419\" (UID: \"2f48ca74-d2f2-4baf-a448-e980848ac419\") " Mar 09 13:34:02 crc kubenswrapper[4703]: I0309 13:34:02.372256 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2f48ca74-d2f2-4baf-a448-e980848ac419-bundle" (OuterVolumeSpecName: "bundle") pod "2f48ca74-d2f2-4baf-a448-e980848ac419" (UID: "2f48ca74-d2f2-4baf-a448-e980848ac419"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:34:02 crc kubenswrapper[4703]: I0309 13:34:02.372517 4703 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f48ca74-d2f2-4baf-a448-e980848ac419-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:34:02 crc kubenswrapper[4703]: I0309 13:34:02.380045 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f48ca74-d2f2-4baf-a448-e980848ac419-kube-api-access-s9lgd" (OuterVolumeSpecName: "kube-api-access-s9lgd") pod "2f48ca74-d2f2-4baf-a448-e980848ac419" (UID: "2f48ca74-d2f2-4baf-a448-e980848ac419"). InnerVolumeSpecName "kube-api-access-s9lgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:34:02 crc kubenswrapper[4703]: I0309 13:34:02.387372 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f48ca74-d2f2-4baf-a448-e980848ac419-util" (OuterVolumeSpecName: "util") pod "2f48ca74-d2f2-4baf-a448-e980848ac419" (UID: "2f48ca74-d2f2-4baf-a448-e980848ac419"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:34:02 crc kubenswrapper[4703]: I0309 13:34:02.474205 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9lgd\" (UniqueName: \"kubernetes.io/projected/2f48ca74-d2f2-4baf-a448-e980848ac419-kube-api-access-s9lgd\") on node \"crc\" DevicePath \"\"" Mar 09 13:34:02 crc kubenswrapper[4703]: I0309 13:34:02.474256 4703 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f48ca74-d2f2-4baf-a448-e980848ac419-util\") on node \"crc\" DevicePath \"\"" Mar 09 13:34:03 crc kubenswrapper[4703]: I0309 13:34:03.050972 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m" event={"ID":"2f48ca74-d2f2-4baf-a448-e980848ac419","Type":"ContainerDied","Data":"6fa44cbb92a1544d6d5740300a40fb224fe80947cb0c48a4f24620bf08ab459c"} Mar 09 13:34:03 crc kubenswrapper[4703]: I0309 13:34:03.051555 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fa44cbb92a1544d6d5740300a40fb224fe80947cb0c48a4f24620bf08ab459c" Mar 09 13:34:03 crc kubenswrapper[4703]: I0309 13:34:03.051016 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m" Mar 09 13:34:03 crc kubenswrapper[4703]: I0309 13:34:03.054254 4703 generic.go:334] "Generic (PLEG): container finished" podID="bb09a646-e855-46a3-8091-333d81ef7c8f" containerID="0f4e9edfbc9aad297ec87e9b5f4c229b40c2f094d58b5abe2a633c3836a42535" exitCode=0 Mar 09 13:34:03 crc kubenswrapper[4703]: I0309 13:34:03.054316 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551054-nt48w" event={"ID":"bb09a646-e855-46a3-8091-333d81ef7c8f","Type":"ContainerDied","Data":"0f4e9edfbc9aad297ec87e9b5f4c229b40c2f094d58b5abe2a633c3836a42535"} Mar 09 13:34:04 crc kubenswrapper[4703]: I0309 13:34:04.066103 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kx266" event={"ID":"4696f44c-9b19-4bb0-9232-c0fcdc101439","Type":"ContainerStarted","Data":"b8b128c5f499d2caea8829869e14d530ba7abecf9689fbedcaf9a0a95c360c86"} Mar 09 13:34:04 crc kubenswrapper[4703]: I0309 13:34:04.104612 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kx266" podStartSLOduration=2.567697822 podStartE2EDuration="6.104588823s" podCreationTimestamp="2026-03-09 13:33:58 +0000 UTC" firstStartedPulling="2026-03-09 13:34:00.016157411 +0000 UTC m=+835.983573097" lastFinishedPulling="2026-03-09 13:34:03.553048372 +0000 UTC m=+839.520464098" observedRunningTime="2026-03-09 13:34:04.0978884 +0000 UTC m=+840.065304136" watchObservedRunningTime="2026-03-09 13:34:04.104588823 +0000 UTC m=+840.072004539" Mar 09 13:34:04 crc kubenswrapper[4703]: I0309 13:34:04.360070 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-nt48w" Mar 09 13:34:04 crc kubenswrapper[4703]: I0309 13:34:04.503217 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcr99\" (UniqueName: \"kubernetes.io/projected/bb09a646-e855-46a3-8091-333d81ef7c8f-kube-api-access-jcr99\") pod \"bb09a646-e855-46a3-8091-333d81ef7c8f\" (UID: \"bb09a646-e855-46a3-8091-333d81ef7c8f\") " Mar 09 13:34:04 crc kubenswrapper[4703]: I0309 13:34:04.511123 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb09a646-e855-46a3-8091-333d81ef7c8f-kube-api-access-jcr99" (OuterVolumeSpecName: "kube-api-access-jcr99") pod "bb09a646-e855-46a3-8091-333d81ef7c8f" (UID: "bb09a646-e855-46a3-8091-333d81ef7c8f"). InnerVolumeSpecName "kube-api-access-jcr99". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:34:04 crc kubenswrapper[4703]: I0309 13:34:04.604755 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcr99\" (UniqueName: \"kubernetes.io/projected/bb09a646-e855-46a3-8091-333d81ef7c8f-kube-api-access-jcr99\") on node \"crc\" DevicePath \"\"" Mar 09 13:34:05 crc kubenswrapper[4703]: I0309 13:34:05.075508 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551054-nt48w" Mar 09 13:34:05 crc kubenswrapper[4703]: I0309 13:34:05.075830 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551054-nt48w" event={"ID":"bb09a646-e855-46a3-8091-333d81ef7c8f","Type":"ContainerDied","Data":"f8558ec5239ce47a64fc0d0a687667586924285fda0fa463a26297503e0fab39"} Mar 09 13:34:05 crc kubenswrapper[4703]: I0309 13:34:05.076017 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8558ec5239ce47a64fc0d0a687667586924285fda0fa463a26297503e0fab39" Mar 09 13:34:05 crc kubenswrapper[4703]: I0309 13:34:05.456975 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551048-2kpzv"] Mar 09 13:34:05 crc kubenswrapper[4703]: I0309 13:34:05.461984 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551048-2kpzv"] Mar 09 13:34:06 crc kubenswrapper[4703]: I0309 13:34:06.714162 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a5b1542-68c7-406f-8350-a4c25bf83ea1" path="/var/lib/kubelet/pods/7a5b1542-68c7-406f-8350-a4c25bf83ea1/volumes" Mar 09 13:34:08 crc kubenswrapper[4703]: I0309 13:34:08.967302 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ddrl5"] Mar 09 13:34:08 crc kubenswrapper[4703]: E0309 13:34:08.967635 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb09a646-e855-46a3-8091-333d81ef7c8f" containerName="oc" Mar 09 13:34:08 crc kubenswrapper[4703]: I0309 13:34:08.967658 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb09a646-e855-46a3-8091-333d81ef7c8f" containerName="oc" Mar 09 13:34:08 crc kubenswrapper[4703]: E0309 13:34:08.967675 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f48ca74-d2f2-4baf-a448-e980848ac419" containerName="pull" Mar 09 13:34:08 crc kubenswrapper[4703]: I0309 
13:34:08.967690 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f48ca74-d2f2-4baf-a448-e980848ac419" containerName="pull" Mar 09 13:34:08 crc kubenswrapper[4703]: E0309 13:34:08.967711 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f48ca74-d2f2-4baf-a448-e980848ac419" containerName="extract" Mar 09 13:34:08 crc kubenswrapper[4703]: I0309 13:34:08.967723 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f48ca74-d2f2-4baf-a448-e980848ac419" containerName="extract" Mar 09 13:34:08 crc kubenswrapper[4703]: E0309 13:34:08.967778 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f48ca74-d2f2-4baf-a448-e980848ac419" containerName="util" Mar 09 13:34:08 crc kubenswrapper[4703]: I0309 13:34:08.967789 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f48ca74-d2f2-4baf-a448-e980848ac419" containerName="util" Mar 09 13:34:08 crc kubenswrapper[4703]: I0309 13:34:08.967995 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb09a646-e855-46a3-8091-333d81ef7c8f" containerName="oc" Mar 09 13:34:08 crc kubenswrapper[4703]: I0309 13:34:08.968020 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f48ca74-d2f2-4baf-a448-e980848ac419" containerName="extract" Mar 09 13:34:08 crc kubenswrapper[4703]: I0309 13:34:08.969404 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ddrl5" Mar 09 13:34:08 crc kubenswrapper[4703]: I0309 13:34:08.982398 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ddrl5"] Mar 09 13:34:09 crc kubenswrapper[4703]: I0309 13:34:09.094430 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c6354ff-fc9a-45b0-ba6c-c1597709a0d0-catalog-content\") pod \"certified-operators-ddrl5\" (UID: \"1c6354ff-fc9a-45b0-ba6c-c1597709a0d0\") " pod="openshift-marketplace/certified-operators-ddrl5" Mar 09 13:34:09 crc kubenswrapper[4703]: I0309 13:34:09.094509 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c6354ff-fc9a-45b0-ba6c-c1597709a0d0-utilities\") pod \"certified-operators-ddrl5\" (UID: \"1c6354ff-fc9a-45b0-ba6c-c1597709a0d0\") " pod="openshift-marketplace/certified-operators-ddrl5" Mar 09 13:34:09 crc kubenswrapper[4703]: I0309 13:34:09.094707 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw4r9\" (UniqueName: \"kubernetes.io/projected/1c6354ff-fc9a-45b0-ba6c-c1597709a0d0-kube-api-access-zw4r9\") pod \"certified-operators-ddrl5\" (UID: \"1c6354ff-fc9a-45b0-ba6c-c1597709a0d0\") " pod="openshift-marketplace/certified-operators-ddrl5" Mar 09 13:34:09 crc kubenswrapper[4703]: I0309 13:34:09.105452 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kx266" Mar 09 13:34:09 crc kubenswrapper[4703]: I0309 13:34:09.105608 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kx266" Mar 09 13:34:09 crc kubenswrapper[4703]: I0309 13:34:09.196564 4703 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c6354ff-fc9a-45b0-ba6c-c1597709a0d0-catalog-content\") pod \"certified-operators-ddrl5\" (UID: \"1c6354ff-fc9a-45b0-ba6c-c1597709a0d0\") " pod="openshift-marketplace/certified-operators-ddrl5" Mar 09 13:34:09 crc kubenswrapper[4703]: I0309 13:34:09.196649 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c6354ff-fc9a-45b0-ba6c-c1597709a0d0-utilities\") pod \"certified-operators-ddrl5\" (UID: \"1c6354ff-fc9a-45b0-ba6c-c1597709a0d0\") " pod="openshift-marketplace/certified-operators-ddrl5" Mar 09 13:34:09 crc kubenswrapper[4703]: I0309 13:34:09.196693 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw4r9\" (UniqueName: \"kubernetes.io/projected/1c6354ff-fc9a-45b0-ba6c-c1597709a0d0-kube-api-access-zw4r9\") pod \"certified-operators-ddrl5\" (UID: \"1c6354ff-fc9a-45b0-ba6c-c1597709a0d0\") " pod="openshift-marketplace/certified-operators-ddrl5" Mar 09 13:34:09 crc kubenswrapper[4703]: I0309 13:34:09.197185 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c6354ff-fc9a-45b0-ba6c-c1597709a0d0-catalog-content\") pod \"certified-operators-ddrl5\" (UID: \"1c6354ff-fc9a-45b0-ba6c-c1597709a0d0\") " pod="openshift-marketplace/certified-operators-ddrl5" Mar 09 13:34:09 crc kubenswrapper[4703]: I0309 13:34:09.197231 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c6354ff-fc9a-45b0-ba6c-c1597709a0d0-utilities\") pod \"certified-operators-ddrl5\" (UID: \"1c6354ff-fc9a-45b0-ba6c-c1597709a0d0\") " pod="openshift-marketplace/certified-operators-ddrl5" Mar 09 13:34:09 crc kubenswrapper[4703]: I0309 13:34:09.214884 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw4r9\" 
(UniqueName: \"kubernetes.io/projected/1c6354ff-fc9a-45b0-ba6c-c1597709a0d0-kube-api-access-zw4r9\") pod \"certified-operators-ddrl5\" (UID: \"1c6354ff-fc9a-45b0-ba6c-c1597709a0d0\") " pod="openshift-marketplace/certified-operators-ddrl5" Mar 09 13:34:09 crc kubenswrapper[4703]: I0309 13:34:09.285975 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ddrl5" Mar 09 13:34:09 crc kubenswrapper[4703]: I0309 13:34:09.500126 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:34:09 crc kubenswrapper[4703]: I0309 13:34:09.501091 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:34:09 crc kubenswrapper[4703]: I0309 13:34:09.501146 4703 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" Mar 09 13:34:09 crc kubenswrapper[4703]: I0309 13:34:09.502141 4703 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9a30d8d9d6908c22df3252ca1b072edab287dd75aa2220dbdad578c0ef22ecae"} pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:34:09 crc kubenswrapper[4703]: I0309 13:34:09.502193 4703 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" containerID="cri-o://9a30d8d9d6908c22df3252ca1b072edab287dd75aa2220dbdad578c0ef22ecae" gracePeriod=600 Mar 09 13:34:09 crc kubenswrapper[4703]: I0309 13:34:09.514988 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ddrl5"] Mar 09 13:34:10 crc kubenswrapper[4703]: I0309 13:34:10.108271 4703 generic.go:334] "Generic (PLEG): container finished" podID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerID="9a30d8d9d6908c22df3252ca1b072edab287dd75aa2220dbdad578c0ef22ecae" exitCode=0 Mar 09 13:34:10 crc kubenswrapper[4703]: I0309 13:34:10.108344 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" event={"ID":"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29","Type":"ContainerDied","Data":"9a30d8d9d6908c22df3252ca1b072edab287dd75aa2220dbdad578c0ef22ecae"} Mar 09 13:34:10 crc kubenswrapper[4703]: I0309 13:34:10.108733 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" event={"ID":"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29","Type":"ContainerStarted","Data":"5f24d1f6f12dd9f2dbe4d447d5f4ec4a161023da53dbcd799df4861c45de4b0c"} Mar 09 13:34:10 crc kubenswrapper[4703]: I0309 13:34:10.108753 4703 scope.go:117] "RemoveContainer" containerID="6173caf8bb9b858c40be1c304f6c4689d83a93c7a582d8cb68f949346af67630" Mar 09 13:34:10 crc kubenswrapper[4703]: I0309 13:34:10.113458 4703 generic.go:334] "Generic (PLEG): container finished" podID="1c6354ff-fc9a-45b0-ba6c-c1597709a0d0" containerID="0b60a7e59b9f7b9f9cd3a155d069c2a6c0860c1985eb0735780719a60c92d1a2" exitCode=0 Mar 09 13:34:10 crc kubenswrapper[4703]: I0309 13:34:10.114533 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ddrl5" 
event={"ID":"1c6354ff-fc9a-45b0-ba6c-c1597709a0d0","Type":"ContainerDied","Data":"0b60a7e59b9f7b9f9cd3a155d069c2a6c0860c1985eb0735780719a60c92d1a2"} Mar 09 13:34:10 crc kubenswrapper[4703]: I0309 13:34:10.114563 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ddrl5" event={"ID":"1c6354ff-fc9a-45b0-ba6c-c1597709a0d0","Type":"ContainerStarted","Data":"97cb1243384ea92ac3298721fdb93454b4b5282990688bc1c9d36489d306df4a"} Mar 09 13:34:10 crc kubenswrapper[4703]: I0309 13:34:10.149426 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kx266" podUID="4696f44c-9b19-4bb0-9232-c0fcdc101439" containerName="registry-server" probeResult="failure" output=< Mar 09 13:34:10 crc kubenswrapper[4703]: timeout: failed to connect service ":50051" within 1s Mar 09 13:34:10 crc kubenswrapper[4703]: > Mar 09 13:34:12 crc kubenswrapper[4703]: I0309 13:34:12.131226 4703 generic.go:334] "Generic (PLEG): container finished" podID="1c6354ff-fc9a-45b0-ba6c-c1597709a0d0" containerID="66f63318ce6a9c0b04967c572f3319124eacf23be2fdc490487868aadd7197be" exitCode=0 Mar 09 13:34:12 crc kubenswrapper[4703]: I0309 13:34:12.131293 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ddrl5" event={"ID":"1c6354ff-fc9a-45b0-ba6c-c1597709a0d0","Type":"ContainerDied","Data":"66f63318ce6a9c0b04967c572f3319124eacf23be2fdc490487868aadd7197be"} Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.150454 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ddrl5" event={"ID":"1c6354ff-fc9a-45b0-ba6c-c1597709a0d0","Type":"ContainerStarted","Data":"b11013f1268e85bf89cca2dfc046dcae0fd2e06b5dea7dfe66e38fcbd8665149"} Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.191421 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ddrl5" 
podStartSLOduration=2.769181845 podStartE2EDuration="5.191406241s" podCreationTimestamp="2026-03-09 13:34:08 +0000 UTC" firstStartedPulling="2026-03-09 13:34:10.11541115 +0000 UTC m=+846.082826836" lastFinishedPulling="2026-03-09 13:34:12.537635546 +0000 UTC m=+848.505051232" observedRunningTime="2026-03-09 13:34:13.187419171 +0000 UTC m=+849.154834867" watchObservedRunningTime="2026-03-09 13:34:13.191406241 +0000 UTC m=+849.158821927" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.516283 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6c588f5d89-l7mpr"] Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.516918 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6c588f5d89-l7mpr" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.519493 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.519973 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.519978 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-t7gn2" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.520114 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.520178 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.534415 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6c588f5d89-l7mpr"] Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 
13:34:13.548809 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlvgz\" (UniqueName: \"kubernetes.io/projected/cad59883-2357-4002-a757-689c894f9c33-kube-api-access-vlvgz\") pod \"metallb-operator-controller-manager-6c588f5d89-l7mpr\" (UID: \"cad59883-2357-4002-a757-689c894f9c33\") " pod="metallb-system/metallb-operator-controller-manager-6c588f5d89-l7mpr" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.548893 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cad59883-2357-4002-a757-689c894f9c33-apiservice-cert\") pod \"metallb-operator-controller-manager-6c588f5d89-l7mpr\" (UID: \"cad59883-2357-4002-a757-689c894f9c33\") " pod="metallb-system/metallb-operator-controller-manager-6c588f5d89-l7mpr" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.548921 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cad59883-2357-4002-a757-689c894f9c33-webhook-cert\") pod \"metallb-operator-controller-manager-6c588f5d89-l7mpr\" (UID: \"cad59883-2357-4002-a757-689c894f9c33\") " pod="metallb-system/metallb-operator-controller-manager-6c588f5d89-l7mpr" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.649858 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cad59883-2357-4002-a757-689c894f9c33-apiservice-cert\") pod \"metallb-operator-controller-manager-6c588f5d89-l7mpr\" (UID: \"cad59883-2357-4002-a757-689c894f9c33\") " pod="metallb-system/metallb-operator-controller-manager-6c588f5d89-l7mpr" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.649906 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/cad59883-2357-4002-a757-689c894f9c33-webhook-cert\") pod \"metallb-operator-controller-manager-6c588f5d89-l7mpr\" (UID: \"cad59883-2357-4002-a757-689c894f9c33\") " pod="metallb-system/metallb-operator-controller-manager-6c588f5d89-l7mpr" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.649963 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlvgz\" (UniqueName: \"kubernetes.io/projected/cad59883-2357-4002-a757-689c894f9c33-kube-api-access-vlvgz\") pod \"metallb-operator-controller-manager-6c588f5d89-l7mpr\" (UID: \"cad59883-2357-4002-a757-689c894f9c33\") " pod="metallb-system/metallb-operator-controller-manager-6c588f5d89-l7mpr" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.655336 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cad59883-2357-4002-a757-689c894f9c33-webhook-cert\") pod \"metallb-operator-controller-manager-6c588f5d89-l7mpr\" (UID: \"cad59883-2357-4002-a757-689c894f9c33\") " pod="metallb-system/metallb-operator-controller-manager-6c588f5d89-l7mpr" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.655386 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cad59883-2357-4002-a757-689c894f9c33-apiservice-cert\") pod \"metallb-operator-controller-manager-6c588f5d89-l7mpr\" (UID: \"cad59883-2357-4002-a757-689c894f9c33\") " pod="metallb-system/metallb-operator-controller-manager-6c588f5d89-l7mpr" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.665478 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlvgz\" (UniqueName: \"kubernetes.io/projected/cad59883-2357-4002-a757-689c894f9c33-kube-api-access-vlvgz\") pod \"metallb-operator-controller-manager-6c588f5d89-l7mpr\" (UID: \"cad59883-2357-4002-a757-689c894f9c33\") " 
pod="metallb-system/metallb-operator-controller-manager-6c588f5d89-l7mpr" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.748647 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5bcdd89498-pjlcz"] Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.749317 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5bcdd89498-pjlcz" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.752629 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.752723 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.752819 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-pb7hj" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.768702 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5bcdd89498-pjlcz"] Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.829702 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6c588f5d89-l7mpr" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.851627 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m256s\" (UniqueName: \"kubernetes.io/projected/f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47-kube-api-access-m256s\") pod \"metallb-operator-webhook-server-5bcdd89498-pjlcz\" (UID: \"f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47\") " pod="metallb-system/metallb-operator-webhook-server-5bcdd89498-pjlcz" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.851746 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47-apiservice-cert\") pod \"metallb-operator-webhook-server-5bcdd89498-pjlcz\" (UID: \"f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47\") " pod="metallb-system/metallb-operator-webhook-server-5bcdd89498-pjlcz" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.851779 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47-webhook-cert\") pod \"metallb-operator-webhook-server-5bcdd89498-pjlcz\" (UID: \"f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47\") " pod="metallb-system/metallb-operator-webhook-server-5bcdd89498-pjlcz" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.953081 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m256s\" (UniqueName: \"kubernetes.io/projected/f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47-kube-api-access-m256s\") pod \"metallb-operator-webhook-server-5bcdd89498-pjlcz\" (UID: \"f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47\") " pod="metallb-system/metallb-operator-webhook-server-5bcdd89498-pjlcz" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.953427 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47-apiservice-cert\") pod \"metallb-operator-webhook-server-5bcdd89498-pjlcz\" (UID: \"f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47\") " pod="metallb-system/metallb-operator-webhook-server-5bcdd89498-pjlcz" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.953448 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47-webhook-cert\") pod \"metallb-operator-webhook-server-5bcdd89498-pjlcz\" (UID: \"f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47\") " pod="metallb-system/metallb-operator-webhook-server-5bcdd89498-pjlcz" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.969509 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47-apiservice-cert\") pod \"metallb-operator-webhook-server-5bcdd89498-pjlcz\" (UID: \"f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47\") " pod="metallb-system/metallb-operator-webhook-server-5bcdd89498-pjlcz" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.984399 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47-webhook-cert\") pod \"metallb-operator-webhook-server-5bcdd89498-pjlcz\" (UID: \"f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47\") " pod="metallb-system/metallb-operator-webhook-server-5bcdd89498-pjlcz" Mar 09 13:34:13 crc kubenswrapper[4703]: I0309 13:34:13.991863 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m256s\" (UniqueName: \"kubernetes.io/projected/f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47-kube-api-access-m256s\") pod \"metallb-operator-webhook-server-5bcdd89498-pjlcz\" (UID: 
\"f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47\") " pod="metallb-system/metallb-operator-webhook-server-5bcdd89498-pjlcz" Mar 09 13:34:14 crc kubenswrapper[4703]: I0309 13:34:14.064407 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5bcdd89498-pjlcz" Mar 09 13:34:14 crc kubenswrapper[4703]: I0309 13:34:14.070790 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6c588f5d89-l7mpr"] Mar 09 13:34:14 crc kubenswrapper[4703]: W0309 13:34:14.082944 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcad59883_2357_4002_a757_689c894f9c33.slice/crio-0e167c1306733e41cf3203cac7e3dd5448dc0b2163d3e0948451426d0ab053e6 WatchSource:0}: Error finding container 0e167c1306733e41cf3203cac7e3dd5448dc0b2163d3e0948451426d0ab053e6: Status 404 returned error can't find the container with id 0e167c1306733e41cf3203cac7e3dd5448dc0b2163d3e0948451426d0ab053e6 Mar 09 13:34:14 crc kubenswrapper[4703]: I0309 13:34:14.161641 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6c588f5d89-l7mpr" event={"ID":"cad59883-2357-4002-a757-689c894f9c33","Type":"ContainerStarted","Data":"0e167c1306733e41cf3203cac7e3dd5448dc0b2163d3e0948451426d0ab053e6"} Mar 09 13:34:14 crc kubenswrapper[4703]: I0309 13:34:14.322764 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5bcdd89498-pjlcz"] Mar 09 13:34:14 crc kubenswrapper[4703]: W0309 13:34:14.326254 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2cf06d7_6a45_4d68_9a2e_ef6bcf831e47.slice/crio-a43e4ef869e068e67ed90b576bcd4c66f36e9caf178ac82dfe8019a697a01be4 WatchSource:0}: Error finding container 
a43e4ef869e068e67ed90b576bcd4c66f36e9caf178ac82dfe8019a697a01be4: Status 404 returned error can't find the container with id a43e4ef869e068e67ed90b576bcd4c66f36e9caf178ac82dfe8019a697a01be4 Mar 09 13:34:15 crc kubenswrapper[4703]: I0309 13:34:15.167631 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5bcdd89498-pjlcz" event={"ID":"f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47","Type":"ContainerStarted","Data":"a43e4ef869e068e67ed90b576bcd4c66f36e9caf178ac82dfe8019a697a01be4"} Mar 09 13:34:19 crc kubenswrapper[4703]: I0309 13:34:19.140202 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kx266" Mar 09 13:34:19 crc kubenswrapper[4703]: I0309 13:34:19.179084 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kx266" Mar 09 13:34:19 crc kubenswrapper[4703]: I0309 13:34:19.286116 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ddrl5" Mar 09 13:34:19 crc kubenswrapper[4703]: I0309 13:34:19.286436 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ddrl5" Mar 09 13:34:19 crc kubenswrapper[4703]: I0309 13:34:19.328252 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ddrl5" Mar 09 13:34:19 crc kubenswrapper[4703]: I0309 13:34:19.378980 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kx266"] Mar 09 13:34:20 crc kubenswrapper[4703]: I0309 13:34:20.196670 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5bcdd89498-pjlcz" event={"ID":"f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47","Type":"ContainerStarted","Data":"68dea48bf5f6340869f010538f15378495d29a81498d3f8f39158ad6128a6b94"} Mar 09 
13:34:20 crc kubenswrapper[4703]: I0309 13:34:20.196751 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5bcdd89498-pjlcz" Mar 09 13:34:20 crc kubenswrapper[4703]: I0309 13:34:20.197914 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6c588f5d89-l7mpr" event={"ID":"cad59883-2357-4002-a757-689c894f9c33","Type":"ContainerStarted","Data":"c27fb926702e4b10b8fa092206a7b77e0a73e40fede7cece7ed05a5be6338e9c"} Mar 09 13:34:20 crc kubenswrapper[4703]: I0309 13:34:20.198079 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kx266" podUID="4696f44c-9b19-4bb0-9232-c0fcdc101439" containerName="registry-server" containerID="cri-o://b8b128c5f499d2caea8829869e14d530ba7abecf9689fbedcaf9a0a95c360c86" gracePeriod=2 Mar 09 13:34:20 crc kubenswrapper[4703]: I0309 13:34:20.217195 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5bcdd89498-pjlcz" podStartSLOduration=2.3563695989999998 podStartE2EDuration="7.217177284s" podCreationTimestamp="2026-03-09 13:34:13 +0000 UTC" firstStartedPulling="2026-03-09 13:34:14.328904017 +0000 UTC m=+850.296319703" lastFinishedPulling="2026-03-09 13:34:19.189711702 +0000 UTC m=+855.157127388" observedRunningTime="2026-03-09 13:34:20.216404803 +0000 UTC m=+856.183820509" watchObservedRunningTime="2026-03-09 13:34:20.217177284 +0000 UTC m=+856.184592970" Mar 09 13:34:20 crc kubenswrapper[4703]: I0309 13:34:20.238768 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6c588f5d89-l7mpr" podStartSLOduration=2.1554239060000002 podStartE2EDuration="7.238746123s" podCreationTimestamp="2026-03-09 13:34:13 +0000 UTC" firstStartedPulling="2026-03-09 13:34:14.095298887 +0000 UTC m=+850.062714573" 
lastFinishedPulling="2026-03-09 13:34:19.178621104 +0000 UTC m=+855.146036790" observedRunningTime="2026-03-09 13:34:20.235926104 +0000 UTC m=+856.203341800" watchObservedRunningTime="2026-03-09 13:34:20.238746123 +0000 UTC m=+856.206161809" Mar 09 13:34:20 crc kubenswrapper[4703]: I0309 13:34:20.256341 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ddrl5" Mar 09 13:34:20 crc kubenswrapper[4703]: I0309 13:34:20.586332 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kx266" Mar 09 13:34:20 crc kubenswrapper[4703]: I0309 13:34:20.653783 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntvn6\" (UniqueName: \"kubernetes.io/projected/4696f44c-9b19-4bb0-9232-c0fcdc101439-kube-api-access-ntvn6\") pod \"4696f44c-9b19-4bb0-9232-c0fcdc101439\" (UID: \"4696f44c-9b19-4bb0-9232-c0fcdc101439\") " Mar 09 13:34:20 crc kubenswrapper[4703]: I0309 13:34:20.653915 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4696f44c-9b19-4bb0-9232-c0fcdc101439-catalog-content\") pod \"4696f44c-9b19-4bb0-9232-c0fcdc101439\" (UID: \"4696f44c-9b19-4bb0-9232-c0fcdc101439\") " Mar 09 13:34:20 crc kubenswrapper[4703]: I0309 13:34:20.653990 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4696f44c-9b19-4bb0-9232-c0fcdc101439-utilities\") pod \"4696f44c-9b19-4bb0-9232-c0fcdc101439\" (UID: \"4696f44c-9b19-4bb0-9232-c0fcdc101439\") " Mar 09 13:34:20 crc kubenswrapper[4703]: I0309 13:34:20.655294 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4696f44c-9b19-4bb0-9232-c0fcdc101439-utilities" (OuterVolumeSpecName: "utilities") pod "4696f44c-9b19-4bb0-9232-c0fcdc101439" (UID: 
"4696f44c-9b19-4bb0-9232-c0fcdc101439"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:34:20 crc kubenswrapper[4703]: I0309 13:34:20.675665 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4696f44c-9b19-4bb0-9232-c0fcdc101439-kube-api-access-ntvn6" (OuterVolumeSpecName: "kube-api-access-ntvn6") pod "4696f44c-9b19-4bb0-9232-c0fcdc101439" (UID: "4696f44c-9b19-4bb0-9232-c0fcdc101439"). InnerVolumeSpecName "kube-api-access-ntvn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:34:20 crc kubenswrapper[4703]: I0309 13:34:20.755551 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntvn6\" (UniqueName: \"kubernetes.io/projected/4696f44c-9b19-4bb0-9232-c0fcdc101439-kube-api-access-ntvn6\") on node \"crc\" DevicePath \"\"" Mar 09 13:34:20 crc kubenswrapper[4703]: I0309 13:34:20.755605 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4696f44c-9b19-4bb0-9232-c0fcdc101439-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:34:20 crc kubenswrapper[4703]: I0309 13:34:20.798178 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4696f44c-9b19-4bb0-9232-c0fcdc101439-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4696f44c-9b19-4bb0-9232-c0fcdc101439" (UID: "4696f44c-9b19-4bb0-9232-c0fcdc101439"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:34:20 crc kubenswrapper[4703]: I0309 13:34:20.856920 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4696f44c-9b19-4bb0-9232-c0fcdc101439-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:34:21 crc kubenswrapper[4703]: I0309 13:34:21.207222 4703 generic.go:334] "Generic (PLEG): container finished" podID="4696f44c-9b19-4bb0-9232-c0fcdc101439" containerID="b8b128c5f499d2caea8829869e14d530ba7abecf9689fbedcaf9a0a95c360c86" exitCode=0 Mar 09 13:34:21 crc kubenswrapper[4703]: I0309 13:34:21.207269 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kx266" event={"ID":"4696f44c-9b19-4bb0-9232-c0fcdc101439","Type":"ContainerDied","Data":"b8b128c5f499d2caea8829869e14d530ba7abecf9689fbedcaf9a0a95c360c86"} Mar 09 13:34:21 crc kubenswrapper[4703]: I0309 13:34:21.207335 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kx266" Mar 09 13:34:21 crc kubenswrapper[4703]: I0309 13:34:21.207351 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kx266" event={"ID":"4696f44c-9b19-4bb0-9232-c0fcdc101439","Type":"ContainerDied","Data":"262d34da6fce5e2075934ac9dcdf0fb010e139e5fd0496504353531c44e84a66"} Mar 09 13:34:21 crc kubenswrapper[4703]: I0309 13:34:21.207383 4703 scope.go:117] "RemoveContainer" containerID="b8b128c5f499d2caea8829869e14d530ba7abecf9689fbedcaf9a0a95c360c86" Mar 09 13:34:21 crc kubenswrapper[4703]: I0309 13:34:21.207543 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6c588f5d89-l7mpr" Mar 09 13:34:21 crc kubenswrapper[4703]: I0309 13:34:21.230684 4703 scope.go:117] "RemoveContainer" containerID="d82eb003ad96fb65ac4ba6c5daeab2f6421134ed8f57081e529223b29572cf76" Mar 09 13:34:21 crc kubenswrapper[4703]: I0309 13:34:21.261360 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kx266"] Mar 09 13:34:21 crc kubenswrapper[4703]: I0309 13:34:21.262764 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kx266"] Mar 09 13:34:21 crc kubenswrapper[4703]: I0309 13:34:21.270441 4703 scope.go:117] "RemoveContainer" containerID="7f4c01113b7676da674d69ec5ee8f102599bccbd9bf35e97bee7a971d1ed3850" Mar 09 13:34:21 crc kubenswrapper[4703]: I0309 13:34:21.291310 4703 scope.go:117] "RemoveContainer" containerID="b8b128c5f499d2caea8829869e14d530ba7abecf9689fbedcaf9a0a95c360c86" Mar 09 13:34:21 crc kubenswrapper[4703]: E0309 13:34:21.291867 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8b128c5f499d2caea8829869e14d530ba7abecf9689fbedcaf9a0a95c360c86\": container with ID starting with 
b8b128c5f499d2caea8829869e14d530ba7abecf9689fbedcaf9a0a95c360c86 not found: ID does not exist" containerID="b8b128c5f499d2caea8829869e14d530ba7abecf9689fbedcaf9a0a95c360c86" Mar 09 13:34:21 crc kubenswrapper[4703]: I0309 13:34:21.291903 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8b128c5f499d2caea8829869e14d530ba7abecf9689fbedcaf9a0a95c360c86"} err="failed to get container status \"b8b128c5f499d2caea8829869e14d530ba7abecf9689fbedcaf9a0a95c360c86\": rpc error: code = NotFound desc = could not find container \"b8b128c5f499d2caea8829869e14d530ba7abecf9689fbedcaf9a0a95c360c86\": container with ID starting with b8b128c5f499d2caea8829869e14d530ba7abecf9689fbedcaf9a0a95c360c86 not found: ID does not exist" Mar 09 13:34:21 crc kubenswrapper[4703]: I0309 13:34:21.291928 4703 scope.go:117] "RemoveContainer" containerID="d82eb003ad96fb65ac4ba6c5daeab2f6421134ed8f57081e529223b29572cf76" Mar 09 13:34:21 crc kubenswrapper[4703]: E0309 13:34:21.292236 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d82eb003ad96fb65ac4ba6c5daeab2f6421134ed8f57081e529223b29572cf76\": container with ID starting with d82eb003ad96fb65ac4ba6c5daeab2f6421134ed8f57081e529223b29572cf76 not found: ID does not exist" containerID="d82eb003ad96fb65ac4ba6c5daeab2f6421134ed8f57081e529223b29572cf76" Mar 09 13:34:21 crc kubenswrapper[4703]: I0309 13:34:21.292260 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d82eb003ad96fb65ac4ba6c5daeab2f6421134ed8f57081e529223b29572cf76"} err="failed to get container status \"d82eb003ad96fb65ac4ba6c5daeab2f6421134ed8f57081e529223b29572cf76\": rpc error: code = NotFound desc = could not find container \"d82eb003ad96fb65ac4ba6c5daeab2f6421134ed8f57081e529223b29572cf76\": container with ID starting with d82eb003ad96fb65ac4ba6c5daeab2f6421134ed8f57081e529223b29572cf76 not found: ID does not 
exist" Mar 09 13:34:21 crc kubenswrapper[4703]: I0309 13:34:21.292277 4703 scope.go:117] "RemoveContainer" containerID="7f4c01113b7676da674d69ec5ee8f102599bccbd9bf35e97bee7a971d1ed3850" Mar 09 13:34:21 crc kubenswrapper[4703]: E0309 13:34:21.292454 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f4c01113b7676da674d69ec5ee8f102599bccbd9bf35e97bee7a971d1ed3850\": container with ID starting with 7f4c01113b7676da674d69ec5ee8f102599bccbd9bf35e97bee7a971d1ed3850 not found: ID does not exist" containerID="7f4c01113b7676da674d69ec5ee8f102599bccbd9bf35e97bee7a971d1ed3850" Mar 09 13:34:21 crc kubenswrapper[4703]: I0309 13:34:21.292480 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f4c01113b7676da674d69ec5ee8f102599bccbd9bf35e97bee7a971d1ed3850"} err="failed to get container status \"7f4c01113b7676da674d69ec5ee8f102599bccbd9bf35e97bee7a971d1ed3850\": rpc error: code = NotFound desc = could not find container \"7f4c01113b7676da674d69ec5ee8f102599bccbd9bf35e97bee7a971d1ed3850\": container with ID starting with 7f4c01113b7676da674d69ec5ee8f102599bccbd9bf35e97bee7a971d1ed3850 not found: ID does not exist" Mar 09 13:34:21 crc kubenswrapper[4703]: I0309 13:34:21.579698 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ddrl5"] Mar 09 13:34:22 crc kubenswrapper[4703]: I0309 13:34:22.716917 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4696f44c-9b19-4bb0-9232-c0fcdc101439" path="/var/lib/kubelet/pods/4696f44c-9b19-4bb0-9232-c0fcdc101439/volumes" Mar 09 13:34:23 crc kubenswrapper[4703]: I0309 13:34:23.227090 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ddrl5" podUID="1c6354ff-fc9a-45b0-ba6c-c1597709a0d0" containerName="registry-server" 
containerID="cri-o://b11013f1268e85bf89cca2dfc046dcae0fd2e06b5dea7dfe66e38fcbd8665149" gracePeriod=2 Mar 09 13:34:23 crc kubenswrapper[4703]: I0309 13:34:23.672341 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ddrl5" Mar 09 13:34:23 crc kubenswrapper[4703]: I0309 13:34:23.805498 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw4r9\" (UniqueName: \"kubernetes.io/projected/1c6354ff-fc9a-45b0-ba6c-c1597709a0d0-kube-api-access-zw4r9\") pod \"1c6354ff-fc9a-45b0-ba6c-c1597709a0d0\" (UID: \"1c6354ff-fc9a-45b0-ba6c-c1597709a0d0\") " Mar 09 13:34:23 crc kubenswrapper[4703]: I0309 13:34:23.806020 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c6354ff-fc9a-45b0-ba6c-c1597709a0d0-utilities\") pod \"1c6354ff-fc9a-45b0-ba6c-c1597709a0d0\" (UID: \"1c6354ff-fc9a-45b0-ba6c-c1597709a0d0\") " Mar 09 13:34:23 crc kubenswrapper[4703]: I0309 13:34:23.806189 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c6354ff-fc9a-45b0-ba6c-c1597709a0d0-catalog-content\") pod \"1c6354ff-fc9a-45b0-ba6c-c1597709a0d0\" (UID: \"1c6354ff-fc9a-45b0-ba6c-c1597709a0d0\") " Mar 09 13:34:23 crc kubenswrapper[4703]: I0309 13:34:23.807746 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c6354ff-fc9a-45b0-ba6c-c1597709a0d0-utilities" (OuterVolumeSpecName: "utilities") pod "1c6354ff-fc9a-45b0-ba6c-c1597709a0d0" (UID: "1c6354ff-fc9a-45b0-ba6c-c1597709a0d0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:34:23 crc kubenswrapper[4703]: I0309 13:34:23.812204 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c6354ff-fc9a-45b0-ba6c-c1597709a0d0-kube-api-access-zw4r9" (OuterVolumeSpecName: "kube-api-access-zw4r9") pod "1c6354ff-fc9a-45b0-ba6c-c1597709a0d0" (UID: "1c6354ff-fc9a-45b0-ba6c-c1597709a0d0"). InnerVolumeSpecName "kube-api-access-zw4r9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:34:23 crc kubenswrapper[4703]: I0309 13:34:23.878992 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c6354ff-fc9a-45b0-ba6c-c1597709a0d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c6354ff-fc9a-45b0-ba6c-c1597709a0d0" (UID: "1c6354ff-fc9a-45b0-ba6c-c1597709a0d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:34:23 crc kubenswrapper[4703]: I0309 13:34:23.907951 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw4r9\" (UniqueName: \"kubernetes.io/projected/1c6354ff-fc9a-45b0-ba6c-c1597709a0d0-kube-api-access-zw4r9\") on node \"crc\" DevicePath \"\"" Mar 09 13:34:23 crc kubenswrapper[4703]: I0309 13:34:23.907983 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c6354ff-fc9a-45b0-ba6c-c1597709a0d0-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:34:23 crc kubenswrapper[4703]: I0309 13:34:23.907995 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c6354ff-fc9a-45b0-ba6c-c1597709a0d0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:34:24 crc kubenswrapper[4703]: I0309 13:34:24.238620 4703 generic.go:334] "Generic (PLEG): container finished" podID="1c6354ff-fc9a-45b0-ba6c-c1597709a0d0" 
containerID="b11013f1268e85bf89cca2dfc046dcae0fd2e06b5dea7dfe66e38fcbd8665149" exitCode=0 Mar 09 13:34:24 crc kubenswrapper[4703]: I0309 13:34:24.238686 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ddrl5" event={"ID":"1c6354ff-fc9a-45b0-ba6c-c1597709a0d0","Type":"ContainerDied","Data":"b11013f1268e85bf89cca2dfc046dcae0fd2e06b5dea7dfe66e38fcbd8665149"} Mar 09 13:34:24 crc kubenswrapper[4703]: I0309 13:34:24.238726 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ddrl5" event={"ID":"1c6354ff-fc9a-45b0-ba6c-c1597709a0d0","Type":"ContainerDied","Data":"97cb1243384ea92ac3298721fdb93454b4b5282990688bc1c9d36489d306df4a"} Mar 09 13:34:24 crc kubenswrapper[4703]: I0309 13:34:24.238727 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ddrl5" Mar 09 13:34:24 crc kubenswrapper[4703]: I0309 13:34:24.238828 4703 scope.go:117] "RemoveContainer" containerID="b11013f1268e85bf89cca2dfc046dcae0fd2e06b5dea7dfe66e38fcbd8665149" Mar 09 13:34:24 crc kubenswrapper[4703]: I0309 13:34:24.272431 4703 scope.go:117] "RemoveContainer" containerID="66f63318ce6a9c0b04967c572f3319124eacf23be2fdc490487868aadd7197be" Mar 09 13:34:24 crc kubenswrapper[4703]: I0309 13:34:24.311065 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ddrl5"] Mar 09 13:34:24 crc kubenswrapper[4703]: I0309 13:34:24.311176 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ddrl5"] Mar 09 13:34:24 crc kubenswrapper[4703]: I0309 13:34:24.313141 4703 scope.go:117] "RemoveContainer" containerID="0b60a7e59b9f7b9f9cd3a155d069c2a6c0860c1985eb0735780719a60c92d1a2" Mar 09 13:34:24 crc kubenswrapper[4703]: I0309 13:34:24.348954 4703 scope.go:117] "RemoveContainer" containerID="b11013f1268e85bf89cca2dfc046dcae0fd2e06b5dea7dfe66e38fcbd8665149" Mar 09 
13:34:24 crc kubenswrapper[4703]: E0309 13:34:24.349804 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b11013f1268e85bf89cca2dfc046dcae0fd2e06b5dea7dfe66e38fcbd8665149\": container with ID starting with b11013f1268e85bf89cca2dfc046dcae0fd2e06b5dea7dfe66e38fcbd8665149 not found: ID does not exist" containerID="b11013f1268e85bf89cca2dfc046dcae0fd2e06b5dea7dfe66e38fcbd8665149" Mar 09 13:34:24 crc kubenswrapper[4703]: I0309 13:34:24.349903 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b11013f1268e85bf89cca2dfc046dcae0fd2e06b5dea7dfe66e38fcbd8665149"} err="failed to get container status \"b11013f1268e85bf89cca2dfc046dcae0fd2e06b5dea7dfe66e38fcbd8665149\": rpc error: code = NotFound desc = could not find container \"b11013f1268e85bf89cca2dfc046dcae0fd2e06b5dea7dfe66e38fcbd8665149\": container with ID starting with b11013f1268e85bf89cca2dfc046dcae0fd2e06b5dea7dfe66e38fcbd8665149 not found: ID does not exist" Mar 09 13:34:24 crc kubenswrapper[4703]: I0309 13:34:24.349946 4703 scope.go:117] "RemoveContainer" containerID="66f63318ce6a9c0b04967c572f3319124eacf23be2fdc490487868aadd7197be" Mar 09 13:34:24 crc kubenswrapper[4703]: E0309 13:34:24.350376 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66f63318ce6a9c0b04967c572f3319124eacf23be2fdc490487868aadd7197be\": container with ID starting with 66f63318ce6a9c0b04967c572f3319124eacf23be2fdc490487868aadd7197be not found: ID does not exist" containerID="66f63318ce6a9c0b04967c572f3319124eacf23be2fdc490487868aadd7197be" Mar 09 13:34:24 crc kubenswrapper[4703]: I0309 13:34:24.350430 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66f63318ce6a9c0b04967c572f3319124eacf23be2fdc490487868aadd7197be"} err="failed to get container status 
\"66f63318ce6a9c0b04967c572f3319124eacf23be2fdc490487868aadd7197be\": rpc error: code = NotFound desc = could not find container \"66f63318ce6a9c0b04967c572f3319124eacf23be2fdc490487868aadd7197be\": container with ID starting with 66f63318ce6a9c0b04967c572f3319124eacf23be2fdc490487868aadd7197be not found: ID does not exist" Mar 09 13:34:24 crc kubenswrapper[4703]: I0309 13:34:24.350465 4703 scope.go:117] "RemoveContainer" containerID="0b60a7e59b9f7b9f9cd3a155d069c2a6c0860c1985eb0735780719a60c92d1a2" Mar 09 13:34:24 crc kubenswrapper[4703]: E0309 13:34:24.350733 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b60a7e59b9f7b9f9cd3a155d069c2a6c0860c1985eb0735780719a60c92d1a2\": container with ID starting with 0b60a7e59b9f7b9f9cd3a155d069c2a6c0860c1985eb0735780719a60c92d1a2 not found: ID does not exist" containerID="0b60a7e59b9f7b9f9cd3a155d069c2a6c0860c1985eb0735780719a60c92d1a2" Mar 09 13:34:24 crc kubenswrapper[4703]: I0309 13:34:24.350764 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b60a7e59b9f7b9f9cd3a155d069c2a6c0860c1985eb0735780719a60c92d1a2"} err="failed to get container status \"0b60a7e59b9f7b9f9cd3a155d069c2a6c0860c1985eb0735780719a60c92d1a2\": rpc error: code = NotFound desc = could not find container \"0b60a7e59b9f7b9f9cd3a155d069c2a6c0860c1985eb0735780719a60c92d1a2\": container with ID starting with 0b60a7e59b9f7b9f9cd3a155d069c2a6c0860c1985eb0735780719a60c92d1a2 not found: ID does not exist" Mar 09 13:34:24 crc kubenswrapper[4703]: I0309 13:34:24.713469 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c6354ff-fc9a-45b0-ba6c-c1597709a0d0" path="/var/lib/kubelet/pods/1c6354ff-fc9a-45b0-ba6c-c1597709a0d0/volumes" Mar 09 13:34:26 crc kubenswrapper[4703]: I0309 13:34:26.009814 4703 scope.go:117] "RemoveContainer" containerID="c2d7e62e3cd6c1d43bd865acf82212b18828f4ae68f49464aa1a41c4ae9950e7" Mar 09 
13:34:34 crc kubenswrapper[4703]: I0309 13:34:34.071039 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5bcdd89498-pjlcz" Mar 09 13:34:53 crc kubenswrapper[4703]: I0309 13:34:53.835762 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6c588f5d89-l7mpr" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.660272 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-mqpjj"] Mar 09 13:34:54 crc kubenswrapper[4703]: E0309 13:34:54.660628 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6354ff-fc9a-45b0-ba6c-c1597709a0d0" containerName="extract-utilities" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.660656 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6354ff-fc9a-45b0-ba6c-c1597709a0d0" containerName="extract-utilities" Mar 09 13:34:54 crc kubenswrapper[4703]: E0309 13:34:54.660672 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4696f44c-9b19-4bb0-9232-c0fcdc101439" containerName="extract-utilities" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.660684 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="4696f44c-9b19-4bb0-9232-c0fcdc101439" containerName="extract-utilities" Mar 09 13:34:54 crc kubenswrapper[4703]: E0309 13:34:54.660707 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4696f44c-9b19-4bb0-9232-c0fcdc101439" containerName="registry-server" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.660719 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="4696f44c-9b19-4bb0-9232-c0fcdc101439" containerName="registry-server" Mar 09 13:34:54 crc kubenswrapper[4703]: E0309 13:34:54.660732 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4696f44c-9b19-4bb0-9232-c0fcdc101439" containerName="extract-content" Mar 09 
13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.660744 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="4696f44c-9b19-4bb0-9232-c0fcdc101439" containerName="extract-content" Mar 09 13:34:54 crc kubenswrapper[4703]: E0309 13:34:54.660762 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6354ff-fc9a-45b0-ba6c-c1597709a0d0" containerName="extract-content" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.660773 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6354ff-fc9a-45b0-ba6c-c1597709a0d0" containerName="extract-content" Mar 09 13:34:54 crc kubenswrapper[4703]: E0309 13:34:54.660791 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6354ff-fc9a-45b0-ba6c-c1597709a0d0" containerName="registry-server" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.660801 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6354ff-fc9a-45b0-ba6c-c1597709a0d0" containerName="registry-server" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.660978 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="4696f44c-9b19-4bb0-9232-c0fcdc101439" containerName="registry-server" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.661008 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6354ff-fc9a-45b0-ba6c-c1597709a0d0" containerName="registry-server" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.661638 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-mqpjj" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.664481 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-plg74" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.664912 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-xrrnq"] Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.665120 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.667639 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.670162 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.670328 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.682914 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-mqpjj"] Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.746127 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-qlmqf"] Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.746956 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-qlmqf" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.751891 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.751940 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-tx2t7" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.752209 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.752389 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.761227 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-htlzv"] Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.762269 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-htlzv" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.763670 4703 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.775777 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-htlzv"] Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.832538 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmp92\" (UniqueName: \"kubernetes.io/projected/b39d0cb4-408c-4af7-a2e1-e3611a3eb09e-kube-api-access-hmp92\") pod \"frr-k8s-xrrnq\" (UID: \"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e\") " pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.832603 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b39d0cb4-408c-4af7-a2e1-e3611a3eb09e-frr-sockets\") pod \"frr-k8s-xrrnq\" (UID: \"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e\") " pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.832623 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jwp5\" (UniqueName: \"kubernetes.io/projected/a7841ae2-e705-49ff-a7f9-83fc45d05454-kube-api-access-4jwp5\") pod \"speaker-qlmqf\" (UID: \"a7841ae2-e705-49ff-a7f9-83fc45d05454\") " pod="metallb-system/speaker-qlmqf" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.832641 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7841ae2-e705-49ff-a7f9-83fc45d05454-metrics-certs\") pod \"speaker-qlmqf\" (UID: \"a7841ae2-e705-49ff-a7f9-83fc45d05454\") " pod="metallb-system/speaker-qlmqf" Mar 09 13:34:54 crc 
kubenswrapper[4703]: I0309 13:34:54.833565 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a7841ae2-e705-49ff-a7f9-83fc45d05454-metallb-excludel2\") pod \"speaker-qlmqf\" (UID: \"a7841ae2-e705-49ff-a7f9-83fc45d05454\") " pod="metallb-system/speaker-qlmqf" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.833675 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b39d0cb4-408c-4af7-a2e1-e3611a3eb09e-metrics\") pod \"frr-k8s-xrrnq\" (UID: \"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e\") " pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.834062 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q444\" (UniqueName: \"kubernetes.io/projected/7941f29f-2aed-45e8-b9c3-d4e0c19573ee-kube-api-access-9q444\") pod \"frr-k8s-webhook-server-7f989f654f-mqpjj\" (UID: \"7941f29f-2aed-45e8-b9c3-d4e0c19573ee\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-mqpjj" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.834097 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b39d0cb4-408c-4af7-a2e1-e3611a3eb09e-frr-startup\") pod \"frr-k8s-xrrnq\" (UID: \"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e\") " pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.834140 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b39d0cb4-408c-4af7-a2e1-e3611a3eb09e-metrics-certs\") pod \"frr-k8s-xrrnq\" (UID: \"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e\") " pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 
13:34:54.834161 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b39d0cb4-408c-4af7-a2e1-e3611a3eb09e-reloader\") pod \"frr-k8s-xrrnq\" (UID: \"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e\") " pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.834187 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7941f29f-2aed-45e8-b9c3-d4e0c19573ee-cert\") pod \"frr-k8s-webhook-server-7f989f654f-mqpjj\" (UID: \"7941f29f-2aed-45e8-b9c3-d4e0c19573ee\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-mqpjj" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.834203 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b39d0cb4-408c-4af7-a2e1-e3611a3eb09e-frr-conf\") pod \"frr-k8s-xrrnq\" (UID: \"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e\") " pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.834223 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a7841ae2-e705-49ff-a7f9-83fc45d05454-memberlist\") pod \"speaker-qlmqf\" (UID: \"a7841ae2-e705-49ff-a7f9-83fc45d05454\") " pod="metallb-system/speaker-qlmqf" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.935547 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b39d0cb4-408c-4af7-a2e1-e3611a3eb09e-metrics\") pod \"frr-k8s-xrrnq\" (UID: \"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e\") " pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.935641 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q444\" 
(UniqueName: \"kubernetes.io/projected/7941f29f-2aed-45e8-b9c3-d4e0c19573ee-kube-api-access-9q444\") pod \"frr-k8s-webhook-server-7f989f654f-mqpjj\" (UID: \"7941f29f-2aed-45e8-b9c3-d4e0c19573ee\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-mqpjj" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.935677 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b39d0cb4-408c-4af7-a2e1-e3611a3eb09e-frr-startup\") pod \"frr-k8s-xrrnq\" (UID: \"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e\") " pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.935714 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b39d0cb4-408c-4af7-a2e1-e3611a3eb09e-metrics-certs\") pod \"frr-k8s-xrrnq\" (UID: \"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e\") " pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.935746 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b39d0cb4-408c-4af7-a2e1-e3611a3eb09e-reloader\") pod \"frr-k8s-xrrnq\" (UID: \"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e\") " pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.935788 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b3ec15b-631d-4ea5-b1f2-899a5d44785d-metrics-certs\") pod \"controller-86ddb6bd46-htlzv\" (UID: \"1b3ec15b-631d-4ea5-b1f2-899a5d44785d\") " pod="metallb-system/controller-86ddb6bd46-htlzv" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.935821 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b39d0cb4-408c-4af7-a2e1-e3611a3eb09e-frr-conf\") pod 
\"frr-k8s-xrrnq\" (UID: \"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e\") " pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.935875 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7941f29f-2aed-45e8-b9c3-d4e0c19573ee-cert\") pod \"frr-k8s-webhook-server-7f989f654f-mqpjj\" (UID: \"7941f29f-2aed-45e8-b9c3-d4e0c19573ee\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-mqpjj" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.935912 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a7841ae2-e705-49ff-a7f9-83fc45d05454-memberlist\") pod \"speaker-qlmqf\" (UID: \"a7841ae2-e705-49ff-a7f9-83fc45d05454\") " pod="metallb-system/speaker-qlmqf" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.935959 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmp92\" (UniqueName: \"kubernetes.io/projected/b39d0cb4-408c-4af7-a2e1-e3611a3eb09e-kube-api-access-hmp92\") pod \"frr-k8s-xrrnq\" (UID: \"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e\") " pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.935993 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b3ec15b-631d-4ea5-b1f2-899a5d44785d-cert\") pod \"controller-86ddb6bd46-htlzv\" (UID: \"1b3ec15b-631d-4ea5-b1f2-899a5d44785d\") " pod="metallb-system/controller-86ddb6bd46-htlzv" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.936043 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g49bk\" (UniqueName: \"kubernetes.io/projected/1b3ec15b-631d-4ea5-b1f2-899a5d44785d-kube-api-access-g49bk\") pod \"controller-86ddb6bd46-htlzv\" (UID: \"1b3ec15b-631d-4ea5-b1f2-899a5d44785d\") " 
pod="metallb-system/controller-86ddb6bd46-htlzv" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.936075 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b39d0cb4-408c-4af7-a2e1-e3611a3eb09e-frr-sockets\") pod \"frr-k8s-xrrnq\" (UID: \"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e\") " pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.936106 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jwp5\" (UniqueName: \"kubernetes.io/projected/a7841ae2-e705-49ff-a7f9-83fc45d05454-kube-api-access-4jwp5\") pod \"speaker-qlmqf\" (UID: \"a7841ae2-e705-49ff-a7f9-83fc45d05454\") " pod="metallb-system/speaker-qlmqf" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.936142 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7841ae2-e705-49ff-a7f9-83fc45d05454-metrics-certs\") pod \"speaker-qlmqf\" (UID: \"a7841ae2-e705-49ff-a7f9-83fc45d05454\") " pod="metallb-system/speaker-qlmqf" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.936157 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b39d0cb4-408c-4af7-a2e1-e3611a3eb09e-metrics\") pod \"frr-k8s-xrrnq\" (UID: \"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e\") " pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.936183 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a7841ae2-e705-49ff-a7f9-83fc45d05454-metallb-excludel2\") pod \"speaker-qlmqf\" (UID: \"a7841ae2-e705-49ff-a7f9-83fc45d05454\") " pod="metallb-system/speaker-qlmqf" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.936292 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b39d0cb4-408c-4af7-a2e1-e3611a3eb09e-reloader\") pod \"frr-k8s-xrrnq\" (UID: \"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e\") " pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: E0309 13:34:54.936315 4703 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 09 13:34:54 crc kubenswrapper[4703]: E0309 13:34:54.936379 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7841ae2-e705-49ff-a7f9-83fc45d05454-memberlist podName:a7841ae2-e705-49ff-a7f9-83fc45d05454 nodeName:}" failed. No retries permitted until 2026-03-09 13:34:55.436359984 +0000 UTC m=+891.403775770 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/a7841ae2-e705-49ff-a7f9-83fc45d05454-memberlist") pod "speaker-qlmqf" (UID: "a7841ae2-e705-49ff-a7f9-83fc45d05454") : secret "metallb-memberlist" not found Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.936781 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b39d0cb4-408c-4af7-a2e1-e3611a3eb09e-frr-conf\") pod \"frr-k8s-xrrnq\" (UID: \"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e\") " pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.936784 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b39d0cb4-408c-4af7-a2e1-e3611a3eb09e-frr-sockets\") pod \"frr-k8s-xrrnq\" (UID: \"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e\") " pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.937209 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b39d0cb4-408c-4af7-a2e1-e3611a3eb09e-frr-startup\") pod \"frr-k8s-xrrnq\" (UID: 
\"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e\") " pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.937354 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a7841ae2-e705-49ff-a7f9-83fc45d05454-metallb-excludel2\") pod \"speaker-qlmqf\" (UID: \"a7841ae2-e705-49ff-a7f9-83fc45d05454\") " pod="metallb-system/speaker-qlmqf" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.943051 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b39d0cb4-408c-4af7-a2e1-e3611a3eb09e-metrics-certs\") pod \"frr-k8s-xrrnq\" (UID: \"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e\") " pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.944318 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7841ae2-e705-49ff-a7f9-83fc45d05454-metrics-certs\") pod \"speaker-qlmqf\" (UID: \"a7841ae2-e705-49ff-a7f9-83fc45d05454\") " pod="metallb-system/speaker-qlmqf" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.955951 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7941f29f-2aed-45e8-b9c3-d4e0c19573ee-cert\") pod \"frr-k8s-webhook-server-7f989f654f-mqpjj\" (UID: \"7941f29f-2aed-45e8-b9c3-d4e0c19573ee\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-mqpjj" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.959593 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmp92\" (UniqueName: \"kubernetes.io/projected/b39d0cb4-408c-4af7-a2e1-e3611a3eb09e-kube-api-access-hmp92\") pod \"frr-k8s-xrrnq\" (UID: \"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e\") " pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.978526 4703 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4jwp5\" (UniqueName: \"kubernetes.io/projected/a7841ae2-e705-49ff-a7f9-83fc45d05454-kube-api-access-4jwp5\") pod \"speaker-qlmqf\" (UID: \"a7841ae2-e705-49ff-a7f9-83fc45d05454\") " pod="metallb-system/speaker-qlmqf" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.979162 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q444\" (UniqueName: \"kubernetes.io/projected/7941f29f-2aed-45e8-b9c3-d4e0c19573ee-kube-api-access-9q444\") pod \"frr-k8s-webhook-server-7f989f654f-mqpjj\" (UID: \"7941f29f-2aed-45e8-b9c3-d4e0c19573ee\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-mqpjj" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.979508 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-mqpjj" Mar 09 13:34:54 crc kubenswrapper[4703]: I0309 13:34:54.992019 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:34:55 crc kubenswrapper[4703]: I0309 13:34:55.038160 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b3ec15b-631d-4ea5-b1f2-899a5d44785d-cert\") pod \"controller-86ddb6bd46-htlzv\" (UID: \"1b3ec15b-631d-4ea5-b1f2-899a5d44785d\") " pod="metallb-system/controller-86ddb6bd46-htlzv" Mar 09 13:34:55 crc kubenswrapper[4703]: I0309 13:34:55.038248 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g49bk\" (UniqueName: \"kubernetes.io/projected/1b3ec15b-631d-4ea5-b1f2-899a5d44785d-kube-api-access-g49bk\") pod \"controller-86ddb6bd46-htlzv\" (UID: \"1b3ec15b-631d-4ea5-b1f2-899a5d44785d\") " pod="metallb-system/controller-86ddb6bd46-htlzv" Mar 09 13:34:55 crc kubenswrapper[4703]: I0309 13:34:55.038434 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b3ec15b-631d-4ea5-b1f2-899a5d44785d-metrics-certs\") pod \"controller-86ddb6bd46-htlzv\" (UID: \"1b3ec15b-631d-4ea5-b1f2-899a5d44785d\") " pod="metallb-system/controller-86ddb6bd46-htlzv" Mar 09 13:34:55 crc kubenswrapper[4703]: I0309 13:34:55.054330 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b3ec15b-631d-4ea5-b1f2-899a5d44785d-cert\") pod \"controller-86ddb6bd46-htlzv\" (UID: \"1b3ec15b-631d-4ea5-b1f2-899a5d44785d\") " pod="metallb-system/controller-86ddb6bd46-htlzv" Mar 09 13:34:55 crc kubenswrapper[4703]: I0309 13:34:55.067376 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g49bk\" (UniqueName: \"kubernetes.io/projected/1b3ec15b-631d-4ea5-b1f2-899a5d44785d-kube-api-access-g49bk\") pod \"controller-86ddb6bd46-htlzv\" (UID: \"1b3ec15b-631d-4ea5-b1f2-899a5d44785d\") " pod="metallb-system/controller-86ddb6bd46-htlzv" Mar 09 13:34:55 
crc kubenswrapper[4703]: I0309 13:34:55.070390 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b3ec15b-631d-4ea5-b1f2-899a5d44785d-metrics-certs\") pod \"controller-86ddb6bd46-htlzv\" (UID: \"1b3ec15b-631d-4ea5-b1f2-899a5d44785d\") " pod="metallb-system/controller-86ddb6bd46-htlzv" Mar 09 13:34:55 crc kubenswrapper[4703]: I0309 13:34:55.077506 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-htlzv" Mar 09 13:34:55 crc kubenswrapper[4703]: I0309 13:34:55.444097 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a7841ae2-e705-49ff-a7f9-83fc45d05454-memberlist\") pod \"speaker-qlmqf\" (UID: \"a7841ae2-e705-49ff-a7f9-83fc45d05454\") " pod="metallb-system/speaker-qlmqf" Mar 09 13:34:55 crc kubenswrapper[4703]: E0309 13:34:55.444385 4703 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 09 13:34:55 crc kubenswrapper[4703]: E0309 13:34:55.444599 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7841ae2-e705-49ff-a7f9-83fc45d05454-memberlist podName:a7841ae2-e705-49ff-a7f9-83fc45d05454 nodeName:}" failed. No retries permitted until 2026-03-09 13:34:56.444575802 +0000 UTC m=+892.411991498 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/a7841ae2-e705-49ff-a7f9-83fc45d05454-memberlist") pod "speaker-qlmqf" (UID: "a7841ae2-e705-49ff-a7f9-83fc45d05454") : secret "metallb-memberlist" not found Mar 09 13:34:55 crc kubenswrapper[4703]: I0309 13:34:55.447239 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xrrnq" event={"ID":"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e","Type":"ContainerStarted","Data":"e24e318b3afe560492c63ecb61b7042e181da5a52813d383f72b11cdfad06e3a"} Mar 09 13:34:55 crc kubenswrapper[4703]: I0309 13:34:55.481012 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-mqpjj"] Mar 09 13:34:55 crc kubenswrapper[4703]: W0309 13:34:55.488596 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7941f29f_2aed_45e8_b9c3_d4e0c19573ee.slice/crio-0b3045e9d1dcf109822ab7f6e606445872e4a86ccb3d51ebbdaab314dfc6d1c9 WatchSource:0}: Error finding container 0b3045e9d1dcf109822ab7f6e606445872e4a86ccb3d51ebbdaab314dfc6d1c9: Status 404 returned error can't find the container with id 0b3045e9d1dcf109822ab7f6e606445872e4a86ccb3d51ebbdaab314dfc6d1c9 Mar 09 13:34:55 crc kubenswrapper[4703]: I0309 13:34:55.529025 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-htlzv"] Mar 09 13:34:55 crc kubenswrapper[4703]: W0309 13:34:55.538742 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b3ec15b_631d_4ea5_b1f2_899a5d44785d.slice/crio-9818bdbb06bb591f4a6285b8fe3eb13a6545270c880f6bd08bad8207f382d27d WatchSource:0}: Error finding container 9818bdbb06bb591f4a6285b8fe3eb13a6545270c880f6bd08bad8207f382d27d: Status 404 returned error can't find the container with id 9818bdbb06bb591f4a6285b8fe3eb13a6545270c880f6bd08bad8207f382d27d Mar 09 13:34:56 crc 
kubenswrapper[4703]: I0309 13:34:56.455810 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-mqpjj" event={"ID":"7941f29f-2aed-45e8-b9c3-d4e0c19573ee","Type":"ContainerStarted","Data":"0b3045e9d1dcf109822ab7f6e606445872e4a86ccb3d51ebbdaab314dfc6d1c9"} Mar 09 13:34:56 crc kubenswrapper[4703]: I0309 13:34:56.458306 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-htlzv" event={"ID":"1b3ec15b-631d-4ea5-b1f2-899a5d44785d","Type":"ContainerStarted","Data":"3d8b9585a2de50c3726be3efcae14f025b9eed890698929f5a8c6e3574ae57b9"} Mar 09 13:34:56 crc kubenswrapper[4703]: I0309 13:34:56.458371 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-htlzv" event={"ID":"1b3ec15b-631d-4ea5-b1f2-899a5d44785d","Type":"ContainerStarted","Data":"9818bdbb06bb591f4a6285b8fe3eb13a6545270c880f6bd08bad8207f382d27d"} Mar 09 13:34:56 crc kubenswrapper[4703]: I0309 13:34:56.460179 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a7841ae2-e705-49ff-a7f9-83fc45d05454-memberlist\") pod \"speaker-qlmqf\" (UID: \"a7841ae2-e705-49ff-a7f9-83fc45d05454\") " pod="metallb-system/speaker-qlmqf" Mar 09 13:34:56 crc kubenswrapper[4703]: I0309 13:34:56.468916 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a7841ae2-e705-49ff-a7f9-83fc45d05454-memberlist\") pod \"speaker-qlmqf\" (UID: \"a7841ae2-e705-49ff-a7f9-83fc45d05454\") " pod="metallb-system/speaker-qlmqf" Mar 09 13:34:56 crc kubenswrapper[4703]: I0309 13:34:56.560942 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-qlmqf" Mar 09 13:34:56 crc kubenswrapper[4703]: W0309 13:34:56.596688 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7841ae2_e705_49ff_a7f9_83fc45d05454.slice/crio-b4a7a5136b3b6aeb190c1611d2bc99b5f7b3feb4c49e5e4776490c42aa81adf5 WatchSource:0}: Error finding container b4a7a5136b3b6aeb190c1611d2bc99b5f7b3feb4c49e5e4776490c42aa81adf5: Status 404 returned error can't find the container with id b4a7a5136b3b6aeb190c1611d2bc99b5f7b3feb4c49e5e4776490c42aa81adf5 Mar 09 13:34:57 crc kubenswrapper[4703]: I0309 13:34:57.467831 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qlmqf" event={"ID":"a7841ae2-e705-49ff-a7f9-83fc45d05454","Type":"ContainerStarted","Data":"06193b88facea74006f9d7af7d02f26ba700ecbe7781e4668c227efde2f86f26"} Mar 09 13:34:57 crc kubenswrapper[4703]: I0309 13:34:57.467896 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qlmqf" event={"ID":"a7841ae2-e705-49ff-a7f9-83fc45d05454","Type":"ContainerStarted","Data":"b4a7a5136b3b6aeb190c1611d2bc99b5f7b3feb4c49e5e4776490c42aa81adf5"} Mar 09 13:35:00 crc kubenswrapper[4703]: I0309 13:35:00.501542 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-htlzv" event={"ID":"1b3ec15b-631d-4ea5-b1f2-899a5d44785d","Type":"ContainerStarted","Data":"81614d09b1229c43aa8c7c5a70acfbe25bbdc7c736f8fbd6cbdf8e63b6157b6f"} Mar 09 13:35:00 crc kubenswrapper[4703]: I0309 13:35:00.502086 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-htlzv" Mar 09 13:35:00 crc kubenswrapper[4703]: I0309 13:35:00.503262 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qlmqf" 
event={"ID":"a7841ae2-e705-49ff-a7f9-83fc45d05454","Type":"ContainerStarted","Data":"c0acbeb70f5c5ad3b6e4872dfa93bdef494814bfa1d92312572f2299ed34734b"} Mar 09 13:35:00 crc kubenswrapper[4703]: I0309 13:35:00.503400 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-qlmqf" Mar 09 13:35:00 crc kubenswrapper[4703]: I0309 13:35:00.518058 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-htlzv" podStartSLOduration=2.798998696 podStartE2EDuration="6.518044415s" podCreationTimestamp="2026-03-09 13:34:54 +0000 UTC" firstStartedPulling="2026-03-09 13:34:55.655986547 +0000 UTC m=+891.623402273" lastFinishedPulling="2026-03-09 13:34:59.375032306 +0000 UTC m=+895.342447992" observedRunningTime="2026-03-09 13:35:00.514977219 +0000 UTC m=+896.482392915" watchObservedRunningTime="2026-03-09 13:35:00.518044415 +0000 UTC m=+896.485460101" Mar 09 13:35:00 crc kubenswrapper[4703]: I0309 13:35:00.533997 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-qlmqf" podStartSLOduration=3.846926115 podStartE2EDuration="6.533977367s" podCreationTimestamp="2026-03-09 13:34:54 +0000 UTC" firstStartedPulling="2026-03-09 13:34:56.859521583 +0000 UTC m=+892.826937269" lastFinishedPulling="2026-03-09 13:34:59.546572835 +0000 UTC m=+895.513988521" observedRunningTime="2026-03-09 13:35:00.5304865 +0000 UTC m=+896.497902196" watchObservedRunningTime="2026-03-09 13:35:00.533977367 +0000 UTC m=+896.501393053" Mar 09 13:35:03 crc kubenswrapper[4703]: I0309 13:35:03.541767 4703 generic.go:334] "Generic (PLEG): container finished" podID="b39d0cb4-408c-4af7-a2e1-e3611a3eb09e" containerID="064838bf5429b650b0e8dc81a523bee6233bb1a37fa3df34930ea8cf4699a543" exitCode=0 Mar 09 13:35:03 crc kubenswrapper[4703]: I0309 13:35:03.541900 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xrrnq" 
event={"ID":"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e","Type":"ContainerDied","Data":"064838bf5429b650b0e8dc81a523bee6233bb1a37fa3df34930ea8cf4699a543"} Mar 09 13:35:03 crc kubenswrapper[4703]: I0309 13:35:03.544424 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-mqpjj" event={"ID":"7941f29f-2aed-45e8-b9c3-d4e0c19573ee","Type":"ContainerStarted","Data":"671f8808172bc6bda9ad52e41e447c9d30303713144dccf6c8b7ad46c538b1f4"} Mar 09 13:35:03 crc kubenswrapper[4703]: I0309 13:35:03.544756 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-mqpjj" Mar 09 13:35:03 crc kubenswrapper[4703]: I0309 13:35:03.616197 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-mqpjj" podStartSLOduration=2.336470553 podStartE2EDuration="9.61617414s" podCreationTimestamp="2026-03-09 13:34:54 +0000 UTC" firstStartedPulling="2026-03-09 13:34:55.491754931 +0000 UTC m=+891.459170627" lastFinishedPulling="2026-03-09 13:35:02.771458518 +0000 UTC m=+898.738874214" observedRunningTime="2026-03-09 13:35:03.615619955 +0000 UTC m=+899.583035681" watchObservedRunningTime="2026-03-09 13:35:03.61617414 +0000 UTC m=+899.583589846" Mar 09 13:35:04 crc kubenswrapper[4703]: I0309 13:35:04.554262 4703 generic.go:334] "Generic (PLEG): container finished" podID="b39d0cb4-408c-4af7-a2e1-e3611a3eb09e" containerID="3d19c89e80f8a58edc3748ad37487d7bd45cd5c5daddee3a937b63d957f82096" exitCode=0 Mar 09 13:35:04 crc kubenswrapper[4703]: I0309 13:35:04.555143 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xrrnq" event={"ID":"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e","Type":"ContainerDied","Data":"3d19c89e80f8a58edc3748ad37487d7bd45cd5c5daddee3a937b63d957f82096"} Mar 09 13:35:05 crc kubenswrapper[4703]: I0309 13:35:05.087057 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="metallb-system/controller-86ddb6bd46-htlzv" Mar 09 13:35:05 crc kubenswrapper[4703]: I0309 13:35:05.564030 4703 generic.go:334] "Generic (PLEG): container finished" podID="b39d0cb4-408c-4af7-a2e1-e3611a3eb09e" containerID="0e1383949fe54b0988e0a0b41b18741073e8384a7a5ca1d0cf0e67a2dc6d1682" exitCode=0 Mar 09 13:35:05 crc kubenswrapper[4703]: I0309 13:35:05.564084 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xrrnq" event={"ID":"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e","Type":"ContainerDied","Data":"0e1383949fe54b0988e0a0b41b18741073e8384a7a5ca1d0cf0e67a2dc6d1682"} Mar 09 13:35:06 crc kubenswrapper[4703]: I0309 13:35:06.564527 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-qlmqf" Mar 09 13:35:06 crc kubenswrapper[4703]: I0309 13:35:06.574878 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xrrnq" event={"ID":"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e","Type":"ContainerStarted","Data":"53e836448648622de9141702ff09c0fa89b008940b2d41b59b96ca13ef79f765"} Mar 09 13:35:06 crc kubenswrapper[4703]: I0309 13:35:06.574935 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xrrnq" event={"ID":"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e","Type":"ContainerStarted","Data":"f2e873108ae5dcaf6816925a602425d1e0d8e3d432abd4cadff9ceb50c575740"} Mar 09 13:35:06 crc kubenswrapper[4703]: I0309 13:35:06.574959 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xrrnq" event={"ID":"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e","Type":"ContainerStarted","Data":"c574a7c2ebe0064096dcb6d4e418b939a043353f88a290632078aab198b30414"} Mar 09 13:35:06 crc kubenswrapper[4703]: I0309 13:35:06.574978 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xrrnq" 
event={"ID":"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e","Type":"ContainerStarted","Data":"789002b5b58819dc23077c07d9a2d95cdd7fa9f47f52420a3c63eca55523b645"} Mar 09 13:35:06 crc kubenswrapper[4703]: I0309 13:35:06.574994 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xrrnq" event={"ID":"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e","Type":"ContainerStarted","Data":"a8057297d15099ecac7a3b78fe3d0f0127604c42ac4a29780d4b3f3271a20dc7"} Mar 09 13:35:06 crc kubenswrapper[4703]: I0309 13:35:06.575010 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xrrnq" event={"ID":"b39d0cb4-408c-4af7-a2e1-e3611a3eb09e","Type":"ContainerStarted","Data":"e8324df790efdae9d3a01e253e7d62edfbd53ecd3d34584908a1d51b7cb0628e"} Mar 09 13:35:06 crc kubenswrapper[4703]: I0309 13:35:06.575035 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:35:06 crc kubenswrapper[4703]: I0309 13:35:06.613598 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-xrrnq" podStartSLOduration=5.019149144 podStartE2EDuration="12.613581022s" podCreationTimestamp="2026-03-09 13:34:54 +0000 UTC" firstStartedPulling="2026-03-09 13:34:55.19433449 +0000 UTC m=+891.161750176" lastFinishedPulling="2026-03-09 13:35:02.788766328 +0000 UTC m=+898.756182054" observedRunningTime="2026-03-09 13:35:06.608803209 +0000 UTC m=+902.576218925" watchObservedRunningTime="2026-03-09 13:35:06.613581022 +0000 UTC m=+902.580996718" Mar 09 13:35:09 crc kubenswrapper[4703]: I0309 13:35:09.992763 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:35:10 crc kubenswrapper[4703]: I0309 13:35:10.038606 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:35:14 crc kubenswrapper[4703]: I0309 13:35:14.993279 4703 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-mqpjj" Mar 09 13:35:15 crc kubenswrapper[4703]: I0309 13:35:15.789989 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-4m97d"] Mar 09 13:35:15 crc kubenswrapper[4703]: I0309 13:35:15.790769 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-4m97d" Mar 09 13:35:15 crc kubenswrapper[4703]: I0309 13:35:15.792873 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 09 13:35:15 crc kubenswrapper[4703]: I0309 13:35:15.793752 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 09 13:35:15 crc kubenswrapper[4703]: I0309 13:35:15.796589 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-46wck" Mar 09 13:35:15 crc kubenswrapper[4703]: I0309 13:35:15.813329 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-4m97d"] Mar 09 13:35:15 crc kubenswrapper[4703]: I0309 13:35:15.981326 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7gk8\" (UniqueName: \"kubernetes.io/projected/501adaf2-d65b-4e4f-97f5-aefac5df2d6f-kube-api-access-c7gk8\") pod \"mariadb-operator-index-4m97d\" (UID: \"501adaf2-d65b-4e4f-97f5-aefac5df2d6f\") " pod="openstack-operators/mariadb-operator-index-4m97d" Mar 09 13:35:16 crc kubenswrapper[4703]: I0309 13:35:16.082706 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7gk8\" (UniqueName: \"kubernetes.io/projected/501adaf2-d65b-4e4f-97f5-aefac5df2d6f-kube-api-access-c7gk8\") pod \"mariadb-operator-index-4m97d\" (UID: \"501adaf2-d65b-4e4f-97f5-aefac5df2d6f\") " 
pod="openstack-operators/mariadb-operator-index-4m97d" Mar 09 13:35:16 crc kubenswrapper[4703]: I0309 13:35:16.115572 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7gk8\" (UniqueName: \"kubernetes.io/projected/501adaf2-d65b-4e4f-97f5-aefac5df2d6f-kube-api-access-c7gk8\") pod \"mariadb-operator-index-4m97d\" (UID: \"501adaf2-d65b-4e4f-97f5-aefac5df2d6f\") " pod="openstack-operators/mariadb-operator-index-4m97d" Mar 09 13:35:16 crc kubenswrapper[4703]: I0309 13:35:16.126136 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-4m97d" Mar 09 13:35:16 crc kubenswrapper[4703]: I0309 13:35:16.613374 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-4m97d"] Mar 09 13:35:16 crc kubenswrapper[4703]: W0309 13:35:16.626055 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod501adaf2_d65b_4e4f_97f5_aefac5df2d6f.slice/crio-e0227cc3d9a0cdd126e3c8f281a92e8a3f2bcc39e02d920db8ac5d90d026ece5 WatchSource:0}: Error finding container e0227cc3d9a0cdd126e3c8f281a92e8a3f2bcc39e02d920db8ac5d90d026ece5: Status 404 returned error can't find the container with id e0227cc3d9a0cdd126e3c8f281a92e8a3f2bcc39e02d920db8ac5d90d026ece5 Mar 09 13:35:16 crc kubenswrapper[4703]: I0309 13:35:16.628142 4703 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:35:16 crc kubenswrapper[4703]: I0309 13:35:16.647960 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-4m97d" event={"ID":"501adaf2-d65b-4e4f-97f5-aefac5df2d6f","Type":"ContainerStarted","Data":"e0227cc3d9a0cdd126e3c8f281a92e8a3f2bcc39e02d920db8ac5d90d026ece5"} Mar 09 13:35:18 crc kubenswrapper[4703]: I0309 13:35:18.663455 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-index-4m97d" event={"ID":"501adaf2-d65b-4e4f-97f5-aefac5df2d6f","Type":"ContainerStarted","Data":"2873c0f36a60a00efbd2fafdc6d961f427ff278d1cd38fb45f4cd7072ff4f812"} Mar 09 13:35:18 crc kubenswrapper[4703]: I0309 13:35:18.686112 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-4m97d" podStartSLOduration=2.6776463059999998 podStartE2EDuration="3.686083469s" podCreationTimestamp="2026-03-09 13:35:15 +0000 UTC" firstStartedPulling="2026-03-09 13:35:16.627964361 +0000 UTC m=+912.595380037" lastFinishedPulling="2026-03-09 13:35:17.636401494 +0000 UTC m=+913.603817200" observedRunningTime="2026-03-09 13:35:18.683545981 +0000 UTC m=+914.650961717" watchObservedRunningTime="2026-03-09 13:35:18.686083469 +0000 UTC m=+914.653499195" Mar 09 13:35:25 crc kubenswrapper[4703]: I0309 13:35:25.017748 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-xrrnq" Mar 09 13:35:26 crc kubenswrapper[4703]: I0309 13:35:26.127011 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-4m97d" Mar 09 13:35:26 crc kubenswrapper[4703]: I0309 13:35:26.127101 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-4m97d" Mar 09 13:35:26 crc kubenswrapper[4703]: I0309 13:35:26.170126 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-4m97d" Mar 09 13:35:26 crc kubenswrapper[4703]: I0309 13:35:26.754575 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-4m97d" Mar 09 13:35:28 crc kubenswrapper[4703]: I0309 13:35:28.076365 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6"] Mar 09 
13:35:28 crc kubenswrapper[4703]: I0309 13:35:28.077330 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6" Mar 09 13:35:28 crc kubenswrapper[4703]: I0309 13:35:28.080621 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-cxl8l" Mar 09 13:35:28 crc kubenswrapper[4703]: I0309 13:35:28.089977 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6"] Mar 09 13:35:28 crc kubenswrapper[4703]: I0309 13:35:28.179631 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67047f4d-9af8-4bbe-841e-a0f0260951d1-bundle\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6\" (UID: \"67047f4d-9af8-4bbe-841e-a0f0260951d1\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6" Mar 09 13:35:28 crc kubenswrapper[4703]: I0309 13:35:28.179928 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67047f4d-9af8-4bbe-841e-a0f0260951d1-util\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6\" (UID: \"67047f4d-9af8-4bbe-841e-a0f0260951d1\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6" Mar 09 13:35:28 crc kubenswrapper[4703]: I0309 13:35:28.180023 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxmmm\" (UniqueName: \"kubernetes.io/projected/67047f4d-9af8-4bbe-841e-a0f0260951d1-kube-api-access-gxmmm\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6\" (UID: \"67047f4d-9af8-4bbe-841e-a0f0260951d1\") " 
pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6" Mar 09 13:35:28 crc kubenswrapper[4703]: I0309 13:35:28.281324 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxmmm\" (UniqueName: \"kubernetes.io/projected/67047f4d-9af8-4bbe-841e-a0f0260951d1-kube-api-access-gxmmm\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6\" (UID: \"67047f4d-9af8-4bbe-841e-a0f0260951d1\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6" Mar 09 13:35:28 crc kubenswrapper[4703]: I0309 13:35:28.281475 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67047f4d-9af8-4bbe-841e-a0f0260951d1-bundle\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6\" (UID: \"67047f4d-9af8-4bbe-841e-a0f0260951d1\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6" Mar 09 13:35:28 crc kubenswrapper[4703]: I0309 13:35:28.281587 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67047f4d-9af8-4bbe-841e-a0f0260951d1-util\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6\" (UID: \"67047f4d-9af8-4bbe-841e-a0f0260951d1\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6" Mar 09 13:35:28 crc kubenswrapper[4703]: I0309 13:35:28.282341 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67047f4d-9af8-4bbe-841e-a0f0260951d1-util\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6\" (UID: \"67047f4d-9af8-4bbe-841e-a0f0260951d1\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6" Mar 09 13:35:28 crc kubenswrapper[4703]: I0309 13:35:28.282568 4703 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67047f4d-9af8-4bbe-841e-a0f0260951d1-bundle\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6\" (UID: \"67047f4d-9af8-4bbe-841e-a0f0260951d1\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6" Mar 09 13:35:28 crc kubenswrapper[4703]: I0309 13:35:28.307489 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxmmm\" (UniqueName: \"kubernetes.io/projected/67047f4d-9af8-4bbe-841e-a0f0260951d1-kube-api-access-gxmmm\") pod \"b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6\" (UID: \"67047f4d-9af8-4bbe-841e-a0f0260951d1\") " pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6" Mar 09 13:35:28 crc kubenswrapper[4703]: I0309 13:35:28.390775 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6" Mar 09 13:35:28 crc kubenswrapper[4703]: I0309 13:35:28.655536 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6"] Mar 09 13:35:28 crc kubenswrapper[4703]: I0309 13:35:28.733137 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6" event={"ID":"67047f4d-9af8-4bbe-841e-a0f0260951d1","Type":"ContainerStarted","Data":"add0621d0976ddea8fd7c452fb78662343ebd6335cb833504565c5815dcae248"} Mar 09 13:35:29 crc kubenswrapper[4703]: I0309 13:35:29.744962 4703 generic.go:334] "Generic (PLEG): container finished" podID="67047f4d-9af8-4bbe-841e-a0f0260951d1" containerID="74867807e231a988cf8ff5f2097f91b1ae6c1cd31322635c4495d834a3894291" exitCode=0 Mar 09 13:35:29 crc kubenswrapper[4703]: I0309 13:35:29.745037 4703 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6" event={"ID":"67047f4d-9af8-4bbe-841e-a0f0260951d1","Type":"ContainerDied","Data":"74867807e231a988cf8ff5f2097f91b1ae6c1cd31322635c4495d834a3894291"} Mar 09 13:35:31 crc kubenswrapper[4703]: I0309 13:35:31.766491 4703 generic.go:334] "Generic (PLEG): container finished" podID="67047f4d-9af8-4bbe-841e-a0f0260951d1" containerID="e343ed8ae902d506b5f08c379658c23e46e43d4c2e7f091d3685a50da8ed24af" exitCode=0 Mar 09 13:35:31 crc kubenswrapper[4703]: I0309 13:35:31.766611 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6" event={"ID":"67047f4d-9af8-4bbe-841e-a0f0260951d1","Type":"ContainerDied","Data":"e343ed8ae902d506b5f08c379658c23e46e43d4c2e7f091d3685a50da8ed24af"} Mar 09 13:35:32 crc kubenswrapper[4703]: I0309 13:35:32.777718 4703 generic.go:334] "Generic (PLEG): container finished" podID="67047f4d-9af8-4bbe-841e-a0f0260951d1" containerID="fef669e8097a182022f24fc2b549ca6651eb824d7f6d6c46f0609b95efaa85ce" exitCode=0 Mar 09 13:35:32 crc kubenswrapper[4703]: I0309 13:35:32.777797 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6" event={"ID":"67047f4d-9af8-4bbe-841e-a0f0260951d1","Type":"ContainerDied","Data":"fef669e8097a182022f24fc2b549ca6651eb824d7f6d6c46f0609b95efaa85ce"} Mar 09 13:35:34 crc kubenswrapper[4703]: I0309 13:35:34.131994 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6" Mar 09 13:35:34 crc kubenswrapper[4703]: I0309 13:35:34.182406 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67047f4d-9af8-4bbe-841e-a0f0260951d1-util\") pod \"67047f4d-9af8-4bbe-841e-a0f0260951d1\" (UID: \"67047f4d-9af8-4bbe-841e-a0f0260951d1\") " Mar 09 13:35:34 crc kubenswrapper[4703]: I0309 13:35:34.182509 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxmmm\" (UniqueName: \"kubernetes.io/projected/67047f4d-9af8-4bbe-841e-a0f0260951d1-kube-api-access-gxmmm\") pod \"67047f4d-9af8-4bbe-841e-a0f0260951d1\" (UID: \"67047f4d-9af8-4bbe-841e-a0f0260951d1\") " Mar 09 13:35:34 crc kubenswrapper[4703]: I0309 13:35:34.182634 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67047f4d-9af8-4bbe-841e-a0f0260951d1-bundle\") pod \"67047f4d-9af8-4bbe-841e-a0f0260951d1\" (UID: \"67047f4d-9af8-4bbe-841e-a0f0260951d1\") " Mar 09 13:35:34 crc kubenswrapper[4703]: I0309 13:35:34.184478 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67047f4d-9af8-4bbe-841e-a0f0260951d1-bundle" (OuterVolumeSpecName: "bundle") pod "67047f4d-9af8-4bbe-841e-a0f0260951d1" (UID: "67047f4d-9af8-4bbe-841e-a0f0260951d1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:35:34 crc kubenswrapper[4703]: I0309 13:35:34.190151 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67047f4d-9af8-4bbe-841e-a0f0260951d1-kube-api-access-gxmmm" (OuterVolumeSpecName: "kube-api-access-gxmmm") pod "67047f4d-9af8-4bbe-841e-a0f0260951d1" (UID: "67047f4d-9af8-4bbe-841e-a0f0260951d1"). InnerVolumeSpecName "kube-api-access-gxmmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:35:34 crc kubenswrapper[4703]: I0309 13:35:34.284723 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxmmm\" (UniqueName: \"kubernetes.io/projected/67047f4d-9af8-4bbe-841e-a0f0260951d1-kube-api-access-gxmmm\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:34 crc kubenswrapper[4703]: I0309 13:35:34.284763 4703 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67047f4d-9af8-4bbe-841e-a0f0260951d1-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:34 crc kubenswrapper[4703]: I0309 13:35:34.405581 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67047f4d-9af8-4bbe-841e-a0f0260951d1-util" (OuterVolumeSpecName: "util") pod "67047f4d-9af8-4bbe-841e-a0f0260951d1" (UID: "67047f4d-9af8-4bbe-841e-a0f0260951d1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:35:34 crc kubenswrapper[4703]: I0309 13:35:34.487546 4703 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67047f4d-9af8-4bbe-841e-a0f0260951d1-util\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:34 crc kubenswrapper[4703]: I0309 13:35:34.794966 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6" event={"ID":"67047f4d-9af8-4bbe-841e-a0f0260951d1","Type":"ContainerDied","Data":"add0621d0976ddea8fd7c452fb78662343ebd6335cb833504565c5815dcae248"} Mar 09 13:35:34 crc kubenswrapper[4703]: I0309 13:35:34.795028 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="add0621d0976ddea8fd7c452fb78662343ebd6335cb833504565c5815dcae248" Mar 09 13:35:34 crc kubenswrapper[4703]: I0309 13:35:34.795144 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6" Mar 09 13:35:38 crc kubenswrapper[4703]: I0309 13:35:38.783023 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5"] Mar 09 13:35:38 crc kubenswrapper[4703]: E0309 13:35:38.783799 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67047f4d-9af8-4bbe-841e-a0f0260951d1" containerName="extract" Mar 09 13:35:38 crc kubenswrapper[4703]: I0309 13:35:38.783813 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="67047f4d-9af8-4bbe-841e-a0f0260951d1" containerName="extract" Mar 09 13:35:38 crc kubenswrapper[4703]: E0309 13:35:38.783828 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67047f4d-9af8-4bbe-841e-a0f0260951d1" containerName="util" Mar 09 13:35:38 crc kubenswrapper[4703]: I0309 13:35:38.783836 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="67047f4d-9af8-4bbe-841e-a0f0260951d1" containerName="util" Mar 09 13:35:38 crc kubenswrapper[4703]: E0309 13:35:38.783871 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67047f4d-9af8-4bbe-841e-a0f0260951d1" containerName="pull" Mar 09 13:35:38 crc kubenswrapper[4703]: I0309 13:35:38.783879 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="67047f4d-9af8-4bbe-841e-a0f0260951d1" containerName="pull" Mar 09 13:35:38 crc kubenswrapper[4703]: I0309 13:35:38.783992 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="67047f4d-9af8-4bbe-841e-a0f0260951d1" containerName="extract" Mar 09 13:35:38 crc kubenswrapper[4703]: I0309 13:35:38.784486 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5" Mar 09 13:35:38 crc kubenswrapper[4703]: I0309 13:35:38.786127 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 09 13:35:38 crc kubenswrapper[4703]: I0309 13:35:38.786308 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-kcpnk" Mar 09 13:35:38 crc kubenswrapper[4703]: I0309 13:35:38.786600 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Mar 09 13:35:38 crc kubenswrapper[4703]: I0309 13:35:38.800897 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5"] Mar 09 13:35:38 crc kubenswrapper[4703]: I0309 13:35:38.845785 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbnbl\" (UniqueName: \"kubernetes.io/projected/5be038e1-1768-4ebe-9ec8-d86cb2173e75-kube-api-access-pbnbl\") pod \"mariadb-operator-controller-manager-94c4fbb79-v7dq5\" (UID: \"5be038e1-1768-4ebe-9ec8-d86cb2173e75\") " pod="openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5" Mar 09 13:35:38 crc kubenswrapper[4703]: I0309 13:35:38.845926 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5be038e1-1768-4ebe-9ec8-d86cb2173e75-apiservice-cert\") pod \"mariadb-operator-controller-manager-94c4fbb79-v7dq5\" (UID: \"5be038e1-1768-4ebe-9ec8-d86cb2173e75\") " pod="openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5" Mar 09 13:35:38 crc kubenswrapper[4703]: I0309 13:35:38.845963 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5be038e1-1768-4ebe-9ec8-d86cb2173e75-webhook-cert\") pod \"mariadb-operator-controller-manager-94c4fbb79-v7dq5\" (UID: \"5be038e1-1768-4ebe-9ec8-d86cb2173e75\") " pod="openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5" Mar 09 13:35:38 crc kubenswrapper[4703]: I0309 13:35:38.946886 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5be038e1-1768-4ebe-9ec8-d86cb2173e75-apiservice-cert\") pod \"mariadb-operator-controller-manager-94c4fbb79-v7dq5\" (UID: \"5be038e1-1768-4ebe-9ec8-d86cb2173e75\") " pod="openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5" Mar 09 13:35:38 crc kubenswrapper[4703]: I0309 13:35:38.947140 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5be038e1-1768-4ebe-9ec8-d86cb2173e75-webhook-cert\") pod \"mariadb-operator-controller-manager-94c4fbb79-v7dq5\" (UID: \"5be038e1-1768-4ebe-9ec8-d86cb2173e75\") " pod="openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5" Mar 09 13:35:38 crc kubenswrapper[4703]: I0309 13:35:38.947336 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbnbl\" (UniqueName: \"kubernetes.io/projected/5be038e1-1768-4ebe-9ec8-d86cb2173e75-kube-api-access-pbnbl\") pod \"mariadb-operator-controller-manager-94c4fbb79-v7dq5\" (UID: \"5be038e1-1768-4ebe-9ec8-d86cb2173e75\") " pod="openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5" Mar 09 13:35:38 crc kubenswrapper[4703]: I0309 13:35:38.952983 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5be038e1-1768-4ebe-9ec8-d86cb2173e75-webhook-cert\") pod \"mariadb-operator-controller-manager-94c4fbb79-v7dq5\" (UID: \"5be038e1-1768-4ebe-9ec8-d86cb2173e75\") " 
pod="openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5" Mar 09 13:35:38 crc kubenswrapper[4703]: I0309 13:35:38.959491 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5be038e1-1768-4ebe-9ec8-d86cb2173e75-apiservice-cert\") pod \"mariadb-operator-controller-manager-94c4fbb79-v7dq5\" (UID: \"5be038e1-1768-4ebe-9ec8-d86cb2173e75\") " pod="openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5" Mar 09 13:35:38 crc kubenswrapper[4703]: I0309 13:35:38.963336 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbnbl\" (UniqueName: \"kubernetes.io/projected/5be038e1-1768-4ebe-9ec8-d86cb2173e75-kube-api-access-pbnbl\") pod \"mariadb-operator-controller-manager-94c4fbb79-v7dq5\" (UID: \"5be038e1-1768-4ebe-9ec8-d86cb2173e75\") " pod="openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5" Mar 09 13:35:39 crc kubenswrapper[4703]: I0309 13:35:39.102937 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5" Mar 09 13:35:39 crc kubenswrapper[4703]: I0309 13:35:39.541090 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5"] Mar 09 13:35:39 crc kubenswrapper[4703]: W0309 13:35:39.552523 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5be038e1_1768_4ebe_9ec8_d86cb2173e75.slice/crio-e8ec7f71044982404f741e6ab6bddceba3fb2adb195983ac1c73342fce126da8 WatchSource:0}: Error finding container e8ec7f71044982404f741e6ab6bddceba3fb2adb195983ac1c73342fce126da8: Status 404 returned error can't find the container with id e8ec7f71044982404f741e6ab6bddceba3fb2adb195983ac1c73342fce126da8 Mar 09 13:35:39 crc kubenswrapper[4703]: I0309 13:35:39.823378 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5" event={"ID":"5be038e1-1768-4ebe-9ec8-d86cb2173e75","Type":"ContainerStarted","Data":"e8ec7f71044982404f741e6ab6bddceba3fb2adb195983ac1c73342fce126da8"} Mar 09 13:35:43 crc kubenswrapper[4703]: I0309 13:35:43.847802 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5" event={"ID":"5be038e1-1768-4ebe-9ec8-d86cb2173e75","Type":"ContainerStarted","Data":"3a0736a0f35a2b956da2c9dcb4b2acef881ae202604567408b11f9beb84019c4"} Mar 09 13:35:43 crc kubenswrapper[4703]: I0309 13:35:43.848406 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5" Mar 09 13:35:43 crc kubenswrapper[4703]: I0309 13:35:43.869959 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5" podStartSLOduration=2.636209193 
podStartE2EDuration="5.869931051s" podCreationTimestamp="2026-03-09 13:35:38 +0000 UTC" firstStartedPulling="2026-03-09 13:35:39.555862231 +0000 UTC m=+935.523277937" lastFinishedPulling="2026-03-09 13:35:42.789584109 +0000 UTC m=+938.756999795" observedRunningTime="2026-03-09 13:35:43.86655111 +0000 UTC m=+939.833966806" watchObservedRunningTime="2026-03-09 13:35:43.869931051 +0000 UTC m=+939.837346747" Mar 09 13:35:44 crc kubenswrapper[4703]: I0309 13:35:44.185865 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vgzzz"] Mar 09 13:35:44 crc kubenswrapper[4703]: I0309 13:35:44.187045 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgzzz" Mar 09 13:35:44 crc kubenswrapper[4703]: I0309 13:35:44.241479 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgzzz"] Mar 09 13:35:44 crc kubenswrapper[4703]: I0309 13:35:44.244736 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ac99cdd-0b59-4696-beaf-b878380104db-utilities\") pod \"redhat-marketplace-vgzzz\" (UID: \"3ac99cdd-0b59-4696-beaf-b878380104db\") " pod="openshift-marketplace/redhat-marketplace-vgzzz" Mar 09 13:35:44 crc kubenswrapper[4703]: I0309 13:35:44.244807 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ac99cdd-0b59-4696-beaf-b878380104db-catalog-content\") pod \"redhat-marketplace-vgzzz\" (UID: \"3ac99cdd-0b59-4696-beaf-b878380104db\") " pod="openshift-marketplace/redhat-marketplace-vgzzz" Mar 09 13:35:44 crc kubenswrapper[4703]: I0309 13:35:44.245093 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5crf9\" (UniqueName: 
\"kubernetes.io/projected/3ac99cdd-0b59-4696-beaf-b878380104db-kube-api-access-5crf9\") pod \"redhat-marketplace-vgzzz\" (UID: \"3ac99cdd-0b59-4696-beaf-b878380104db\") " pod="openshift-marketplace/redhat-marketplace-vgzzz" Mar 09 13:35:44 crc kubenswrapper[4703]: I0309 13:35:44.346888 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5crf9\" (UniqueName: \"kubernetes.io/projected/3ac99cdd-0b59-4696-beaf-b878380104db-kube-api-access-5crf9\") pod \"redhat-marketplace-vgzzz\" (UID: \"3ac99cdd-0b59-4696-beaf-b878380104db\") " pod="openshift-marketplace/redhat-marketplace-vgzzz" Mar 09 13:35:44 crc kubenswrapper[4703]: I0309 13:35:44.347152 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ac99cdd-0b59-4696-beaf-b878380104db-utilities\") pod \"redhat-marketplace-vgzzz\" (UID: \"3ac99cdd-0b59-4696-beaf-b878380104db\") " pod="openshift-marketplace/redhat-marketplace-vgzzz" Mar 09 13:35:44 crc kubenswrapper[4703]: I0309 13:35:44.347271 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ac99cdd-0b59-4696-beaf-b878380104db-catalog-content\") pod \"redhat-marketplace-vgzzz\" (UID: \"3ac99cdd-0b59-4696-beaf-b878380104db\") " pod="openshift-marketplace/redhat-marketplace-vgzzz" Mar 09 13:35:44 crc kubenswrapper[4703]: I0309 13:35:44.347944 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ac99cdd-0b59-4696-beaf-b878380104db-utilities\") pod \"redhat-marketplace-vgzzz\" (UID: \"3ac99cdd-0b59-4696-beaf-b878380104db\") " pod="openshift-marketplace/redhat-marketplace-vgzzz" Mar 09 13:35:44 crc kubenswrapper[4703]: I0309 13:35:44.348056 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3ac99cdd-0b59-4696-beaf-b878380104db-catalog-content\") pod \"redhat-marketplace-vgzzz\" (UID: \"3ac99cdd-0b59-4696-beaf-b878380104db\") " pod="openshift-marketplace/redhat-marketplace-vgzzz" Mar 09 13:35:44 crc kubenswrapper[4703]: I0309 13:35:44.376854 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5crf9\" (UniqueName: \"kubernetes.io/projected/3ac99cdd-0b59-4696-beaf-b878380104db-kube-api-access-5crf9\") pod \"redhat-marketplace-vgzzz\" (UID: \"3ac99cdd-0b59-4696-beaf-b878380104db\") " pod="openshift-marketplace/redhat-marketplace-vgzzz" Mar 09 13:35:44 crc kubenswrapper[4703]: I0309 13:35:44.502534 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgzzz" Mar 09 13:35:44 crc kubenswrapper[4703]: I0309 13:35:44.902746 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgzzz"] Mar 09 13:35:44 crc kubenswrapper[4703]: W0309 13:35:44.904609 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ac99cdd_0b59_4696_beaf_b878380104db.slice/crio-b7186c87e38456f8fbb06584bc2581f3706471f88374073ba07be067410a9710 WatchSource:0}: Error finding container b7186c87e38456f8fbb06584bc2581f3706471f88374073ba07be067410a9710: Status 404 returned error can't find the container with id b7186c87e38456f8fbb06584bc2581f3706471f88374073ba07be067410a9710 Mar 09 13:35:45 crc kubenswrapper[4703]: I0309 13:35:45.859885 4703 generic.go:334] "Generic (PLEG): container finished" podID="3ac99cdd-0b59-4696-beaf-b878380104db" containerID="c057de500a489b0f4f0cc30f0fa0036893f5e90b2d21c330cb62494092e43bfc" exitCode=0 Mar 09 13:35:45 crc kubenswrapper[4703]: I0309 13:35:45.859938 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgzzz" 
event={"ID":"3ac99cdd-0b59-4696-beaf-b878380104db","Type":"ContainerDied","Data":"c057de500a489b0f4f0cc30f0fa0036893f5e90b2d21c330cb62494092e43bfc"} Mar 09 13:35:45 crc kubenswrapper[4703]: I0309 13:35:45.860247 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgzzz" event={"ID":"3ac99cdd-0b59-4696-beaf-b878380104db","Type":"ContainerStarted","Data":"b7186c87e38456f8fbb06584bc2581f3706471f88374073ba07be067410a9710"} Mar 09 13:35:46 crc kubenswrapper[4703]: I0309 13:35:46.867354 4703 generic.go:334] "Generic (PLEG): container finished" podID="3ac99cdd-0b59-4696-beaf-b878380104db" containerID="0f9a2cbc75f26f73731b3795e0e9e0897283c7f7338d0b58e1c2f87e5e3e3f00" exitCode=0 Mar 09 13:35:46 crc kubenswrapper[4703]: I0309 13:35:46.867426 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgzzz" event={"ID":"3ac99cdd-0b59-4696-beaf-b878380104db","Type":"ContainerDied","Data":"0f9a2cbc75f26f73731b3795e0e9e0897283c7f7338d0b58e1c2f87e5e3e3f00"} Mar 09 13:35:47 crc kubenswrapper[4703]: I0309 13:35:47.876270 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgzzz" event={"ID":"3ac99cdd-0b59-4696-beaf-b878380104db","Type":"ContainerStarted","Data":"b0860c7fc73ec1eb9a8b87c516af7b86177543e6d21dc7384d9bea30d5337859"} Mar 09 13:35:47 crc kubenswrapper[4703]: I0309 13:35:47.894794 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vgzzz" podStartSLOduration=2.40740702 podStartE2EDuration="3.894773054s" podCreationTimestamp="2026-03-09 13:35:44 +0000 UTC" firstStartedPulling="2026-03-09 13:35:45.861810353 +0000 UTC m=+941.829226049" lastFinishedPulling="2026-03-09 13:35:47.349176397 +0000 UTC m=+943.316592083" observedRunningTime="2026-03-09 13:35:47.890955181 +0000 UTC m=+943.858370907" watchObservedRunningTime="2026-03-09 13:35:47.894773054 +0000 UTC m=+943.862188780" 
Mar 09 13:35:49 crc kubenswrapper[4703]: I0309 13:35:49.109324 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5" Mar 09 13:35:54 crc kubenswrapper[4703]: I0309 13:35:54.503338 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vgzzz" Mar 09 13:35:54 crc kubenswrapper[4703]: I0309 13:35:54.503917 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vgzzz" Mar 09 13:35:54 crc kubenswrapper[4703]: I0309 13:35:54.585505 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vgzzz" Mar 09 13:35:54 crc kubenswrapper[4703]: I0309 13:35:54.798628 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-tzq9d"] Mar 09 13:35:54 crc kubenswrapper[4703]: I0309 13:35:54.799992 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-tzq9d" Mar 09 13:35:54 crc kubenswrapper[4703]: I0309 13:35:54.804170 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-mms47" Mar 09 13:35:54 crc kubenswrapper[4703]: I0309 13:35:54.815995 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-tzq9d"] Mar 09 13:35:54 crc kubenswrapper[4703]: I0309 13:35:54.881143 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfcrc\" (UniqueName: \"kubernetes.io/projected/aadbd6bd-930e-473a-851e-820fbfcd22db-kube-api-access-xfcrc\") pod \"infra-operator-index-tzq9d\" (UID: \"aadbd6bd-930e-473a-851e-820fbfcd22db\") " pod="openstack-operators/infra-operator-index-tzq9d" Mar 09 13:35:54 crc kubenswrapper[4703]: I0309 13:35:54.981929 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfcrc\" (UniqueName: \"kubernetes.io/projected/aadbd6bd-930e-473a-851e-820fbfcd22db-kube-api-access-xfcrc\") pod \"infra-operator-index-tzq9d\" (UID: \"aadbd6bd-930e-473a-851e-820fbfcd22db\") " pod="openstack-operators/infra-operator-index-tzq9d" Mar 09 13:35:55 crc kubenswrapper[4703]: I0309 13:35:55.007128 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfcrc\" (UniqueName: \"kubernetes.io/projected/aadbd6bd-930e-473a-851e-820fbfcd22db-kube-api-access-xfcrc\") pod \"infra-operator-index-tzq9d\" (UID: \"aadbd6bd-930e-473a-851e-820fbfcd22db\") " pod="openstack-operators/infra-operator-index-tzq9d" Mar 09 13:35:55 crc kubenswrapper[4703]: I0309 13:35:55.030406 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vgzzz" Mar 09 13:35:55 crc kubenswrapper[4703]: I0309 13:35:55.137436 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-tzq9d" Mar 09 13:35:55 crc kubenswrapper[4703]: I0309 13:35:55.423997 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-tzq9d"] Mar 09 13:35:55 crc kubenswrapper[4703]: I0309 13:35:55.946712 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-tzq9d" event={"ID":"aadbd6bd-930e-473a-851e-820fbfcd22db","Type":"ContainerStarted","Data":"bb3c73b04e101c39ae86290cd4a10eb8598f1dcfec26fd98914bdfafb46b5c62"} Mar 09 13:35:56 crc kubenswrapper[4703]: I0309 13:35:56.957465 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-tzq9d" event={"ID":"aadbd6bd-930e-473a-851e-820fbfcd22db","Type":"ContainerStarted","Data":"3ede56d7dcf6cefa860b43906029fcb5c3f1f6cb99775abcedf9d911b05fe2d0"} Mar 09 13:35:56 crc kubenswrapper[4703]: I0309 13:35:56.983373 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-tzq9d" podStartSLOduration=2.070905787 podStartE2EDuration="2.983348763s" podCreationTimestamp="2026-03-09 13:35:54 +0000 UTC" firstStartedPulling="2026-03-09 13:35:55.434908384 +0000 UTC m=+951.402324090" lastFinishedPulling="2026-03-09 13:35:56.34735134 +0000 UTC m=+952.314767066" observedRunningTime="2026-03-09 13:35:56.980420654 +0000 UTC m=+952.947836390" watchObservedRunningTime="2026-03-09 13:35:56.983348763 +0000 UTC m=+952.950764479" Mar 09 13:35:58 crc kubenswrapper[4703]: I0309 13:35:58.985395 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgzzz"] Mar 09 13:35:58 crc kubenswrapper[4703]: I0309 13:35:58.986221 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vgzzz" podUID="3ac99cdd-0b59-4696-beaf-b878380104db" containerName="registry-server" 
containerID="cri-o://b0860c7fc73ec1eb9a8b87c516af7b86177543e6d21dc7384d9bea30d5337859" gracePeriod=2 Mar 09 13:35:59 crc kubenswrapper[4703]: I0309 13:35:59.414022 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgzzz" Mar 09 13:35:59 crc kubenswrapper[4703]: I0309 13:35:59.539523 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ac99cdd-0b59-4696-beaf-b878380104db-utilities\") pod \"3ac99cdd-0b59-4696-beaf-b878380104db\" (UID: \"3ac99cdd-0b59-4696-beaf-b878380104db\") " Mar 09 13:35:59 crc kubenswrapper[4703]: I0309 13:35:59.539586 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5crf9\" (UniqueName: \"kubernetes.io/projected/3ac99cdd-0b59-4696-beaf-b878380104db-kube-api-access-5crf9\") pod \"3ac99cdd-0b59-4696-beaf-b878380104db\" (UID: \"3ac99cdd-0b59-4696-beaf-b878380104db\") " Mar 09 13:35:59 crc kubenswrapper[4703]: I0309 13:35:59.539617 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ac99cdd-0b59-4696-beaf-b878380104db-catalog-content\") pod \"3ac99cdd-0b59-4696-beaf-b878380104db\" (UID: \"3ac99cdd-0b59-4696-beaf-b878380104db\") " Mar 09 13:35:59 crc kubenswrapper[4703]: I0309 13:35:59.540596 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ac99cdd-0b59-4696-beaf-b878380104db-utilities" (OuterVolumeSpecName: "utilities") pod "3ac99cdd-0b59-4696-beaf-b878380104db" (UID: "3ac99cdd-0b59-4696-beaf-b878380104db"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:35:59 crc kubenswrapper[4703]: I0309 13:35:59.545383 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ac99cdd-0b59-4696-beaf-b878380104db-kube-api-access-5crf9" (OuterVolumeSpecName: "kube-api-access-5crf9") pod "3ac99cdd-0b59-4696-beaf-b878380104db" (UID: "3ac99cdd-0b59-4696-beaf-b878380104db"). InnerVolumeSpecName "kube-api-access-5crf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:35:59 crc kubenswrapper[4703]: I0309 13:35:59.575120 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ac99cdd-0b59-4696-beaf-b878380104db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ac99cdd-0b59-4696-beaf-b878380104db" (UID: "3ac99cdd-0b59-4696-beaf-b878380104db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:35:59 crc kubenswrapper[4703]: I0309 13:35:59.640742 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ac99cdd-0b59-4696-beaf-b878380104db-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:59 crc kubenswrapper[4703]: I0309 13:35:59.640781 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5crf9\" (UniqueName: \"kubernetes.io/projected/3ac99cdd-0b59-4696-beaf-b878380104db-kube-api-access-5crf9\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:59 crc kubenswrapper[4703]: I0309 13:35:59.640794 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ac99cdd-0b59-4696-beaf-b878380104db-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:35:59 crc kubenswrapper[4703]: I0309 13:35:59.999872 4703 generic.go:334] "Generic (PLEG): container finished" podID="3ac99cdd-0b59-4696-beaf-b878380104db" 
containerID="b0860c7fc73ec1eb9a8b87c516af7b86177543e6d21dc7384d9bea30d5337859" exitCode=0 Mar 09 13:35:59 crc kubenswrapper[4703]: I0309 13:35:59.999922 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgzzz" event={"ID":"3ac99cdd-0b59-4696-beaf-b878380104db","Type":"ContainerDied","Data":"b0860c7fc73ec1eb9a8b87c516af7b86177543e6d21dc7384d9bea30d5337859"} Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:35:59.999947 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgzzz" event={"ID":"3ac99cdd-0b59-4696-beaf-b878380104db","Type":"ContainerDied","Data":"b7186c87e38456f8fbb06584bc2581f3706471f88374073ba07be067410a9710"} Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:35:59.999965 4703 scope.go:117] "RemoveContainer" containerID="b0860c7fc73ec1eb9a8b87c516af7b86177543e6d21dc7384d9bea30d5337859" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:35:59.999989 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgzzz" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.023414 4703 scope.go:117] "RemoveContainer" containerID="0f9a2cbc75f26f73731b3795e0e9e0897283c7f7338d0b58e1c2f87e5e3e3f00" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.040393 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgzzz"] Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.051545 4703 scope.go:117] "RemoveContainer" containerID="c057de500a489b0f4f0cc30f0fa0036893f5e90b2d21c330cb62494092e43bfc" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.054820 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgzzz"] Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.069458 4703 scope.go:117] "RemoveContainer" containerID="b0860c7fc73ec1eb9a8b87c516af7b86177543e6d21dc7384d9bea30d5337859" Mar 09 13:36:00 crc kubenswrapper[4703]: E0309 13:36:00.069831 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0860c7fc73ec1eb9a8b87c516af7b86177543e6d21dc7384d9bea30d5337859\": container with ID starting with b0860c7fc73ec1eb9a8b87c516af7b86177543e6d21dc7384d9bea30d5337859 not found: ID does not exist" containerID="b0860c7fc73ec1eb9a8b87c516af7b86177543e6d21dc7384d9bea30d5337859" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.069905 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0860c7fc73ec1eb9a8b87c516af7b86177543e6d21dc7384d9bea30d5337859"} err="failed to get container status \"b0860c7fc73ec1eb9a8b87c516af7b86177543e6d21dc7384d9bea30d5337859\": rpc error: code = NotFound desc = could not find container \"b0860c7fc73ec1eb9a8b87c516af7b86177543e6d21dc7384d9bea30d5337859\": container with ID starting with b0860c7fc73ec1eb9a8b87c516af7b86177543e6d21dc7384d9bea30d5337859 not found: 
ID does not exist" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.069938 4703 scope.go:117] "RemoveContainer" containerID="0f9a2cbc75f26f73731b3795e0e9e0897283c7f7338d0b58e1c2f87e5e3e3f00" Mar 09 13:36:00 crc kubenswrapper[4703]: E0309 13:36:00.070312 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f9a2cbc75f26f73731b3795e0e9e0897283c7f7338d0b58e1c2f87e5e3e3f00\": container with ID starting with 0f9a2cbc75f26f73731b3795e0e9e0897283c7f7338d0b58e1c2f87e5e3e3f00 not found: ID does not exist" containerID="0f9a2cbc75f26f73731b3795e0e9e0897283c7f7338d0b58e1c2f87e5e3e3f00" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.070338 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f9a2cbc75f26f73731b3795e0e9e0897283c7f7338d0b58e1c2f87e5e3e3f00"} err="failed to get container status \"0f9a2cbc75f26f73731b3795e0e9e0897283c7f7338d0b58e1c2f87e5e3e3f00\": rpc error: code = NotFound desc = could not find container \"0f9a2cbc75f26f73731b3795e0e9e0897283c7f7338d0b58e1c2f87e5e3e3f00\": container with ID starting with 0f9a2cbc75f26f73731b3795e0e9e0897283c7f7338d0b58e1c2f87e5e3e3f00 not found: ID does not exist" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.070353 4703 scope.go:117] "RemoveContainer" containerID="c057de500a489b0f4f0cc30f0fa0036893f5e90b2d21c330cb62494092e43bfc" Mar 09 13:36:00 crc kubenswrapper[4703]: E0309 13:36:00.070545 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c057de500a489b0f4f0cc30f0fa0036893f5e90b2d21c330cb62494092e43bfc\": container with ID starting with c057de500a489b0f4f0cc30f0fa0036893f5e90b2d21c330cb62494092e43bfc not found: ID does not exist" containerID="c057de500a489b0f4f0cc30f0fa0036893f5e90b2d21c330cb62494092e43bfc" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.070571 4703 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c057de500a489b0f4f0cc30f0fa0036893f5e90b2d21c330cb62494092e43bfc"} err="failed to get container status \"c057de500a489b0f4f0cc30f0fa0036893f5e90b2d21c330cb62494092e43bfc\": rpc error: code = NotFound desc = could not find container \"c057de500a489b0f4f0cc30f0fa0036893f5e90b2d21c330cb62494092e43bfc\": container with ID starting with c057de500a489b0f4f0cc30f0fa0036893f5e90b2d21c330cb62494092e43bfc not found: ID does not exist" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.145366 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551056-wc7wk"] Mar 09 13:36:00 crc kubenswrapper[4703]: E0309 13:36:00.145592 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac99cdd-0b59-4696-beaf-b878380104db" containerName="registry-server" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.145611 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac99cdd-0b59-4696-beaf-b878380104db" containerName="registry-server" Mar 09 13:36:00 crc kubenswrapper[4703]: E0309 13:36:00.145620 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac99cdd-0b59-4696-beaf-b878380104db" containerName="extract-content" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.145629 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac99cdd-0b59-4696-beaf-b878380104db" containerName="extract-content" Mar 09 13:36:00 crc kubenswrapper[4703]: E0309 13:36:00.145643 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac99cdd-0b59-4696-beaf-b878380104db" containerName="extract-utilities" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.145649 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac99cdd-0b59-4696-beaf-b878380104db" containerName="extract-utilities" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.145745 4703 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3ac99cdd-0b59-4696-beaf-b878380104db" containerName="registry-server" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.146111 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551056-wc7wk" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.147825 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.150059 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.150470 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rkrjn" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.154106 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551056-wc7wk"] Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.154106 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cmz9\" (UniqueName: \"kubernetes.io/projected/c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e-kube-api-access-5cmz9\") pod \"auto-csr-approver-29551056-wc7wk\" (UID: \"c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e\") " pod="openshift-infra/auto-csr-approver-29551056-wc7wk" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.255130 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cmz9\" (UniqueName: \"kubernetes.io/projected/c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e-kube-api-access-5cmz9\") pod \"auto-csr-approver-29551056-wc7wk\" (UID: \"c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e\") " pod="openshift-infra/auto-csr-approver-29551056-wc7wk" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.271332 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cmz9\" 
(UniqueName: \"kubernetes.io/projected/c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e-kube-api-access-5cmz9\") pod \"auto-csr-approver-29551056-wc7wk\" (UID: \"c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e\") " pod="openshift-infra/auto-csr-approver-29551056-wc7wk" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.460342 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551056-wc7wk" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.721022 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ac99cdd-0b59-4696-beaf-b878380104db" path="/var/lib/kubelet/pods/3ac99cdd-0b59-4696-beaf-b878380104db/volumes" Mar 09 13:36:00 crc kubenswrapper[4703]: I0309 13:36:00.900280 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551056-wc7wk"] Mar 09 13:36:00 crc kubenswrapper[4703]: W0309 13:36:00.904545 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8eb7ded_2d2b_4610_bd30_2b9fd1a6d57e.slice/crio-a69ca0536a00c0863a2534857da919744ee878480468aa7d540e754a73498cad WatchSource:0}: Error finding container a69ca0536a00c0863a2534857da919744ee878480468aa7d540e754a73498cad: Status 404 returned error can't find the container with id a69ca0536a00c0863a2534857da919744ee878480468aa7d540e754a73498cad Mar 09 13:36:01 crc kubenswrapper[4703]: I0309 13:36:01.010768 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551056-wc7wk" event={"ID":"c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e","Type":"ContainerStarted","Data":"a69ca0536a00c0863a2534857da919744ee878480468aa7d540e754a73498cad"} Mar 09 13:36:03 crc kubenswrapper[4703]: I0309 13:36:03.026995 4703 generic.go:334] "Generic (PLEG): container finished" podID="c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e" containerID="c74e409538490bf48b93f38832cde9297064709b7b4c29b68d8aaca4d57b6e80" exitCode=0 Mar 09 
13:36:03 crc kubenswrapper[4703]: I0309 13:36:03.027095 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551056-wc7wk" event={"ID":"c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e","Type":"ContainerDied","Data":"c74e409538490bf48b93f38832cde9297064709b7b4c29b68d8aaca4d57b6e80"} Mar 09 13:36:04 crc kubenswrapper[4703]: I0309 13:36:04.344305 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551056-wc7wk" Mar 09 13:36:04 crc kubenswrapper[4703]: I0309 13:36:04.413469 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cmz9\" (UniqueName: \"kubernetes.io/projected/c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e-kube-api-access-5cmz9\") pod \"c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e\" (UID: \"c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e\") " Mar 09 13:36:04 crc kubenswrapper[4703]: I0309 13:36:04.426116 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e-kube-api-access-5cmz9" (OuterVolumeSpecName: "kube-api-access-5cmz9") pod "c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e" (UID: "c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e"). InnerVolumeSpecName "kube-api-access-5cmz9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:36:04 crc kubenswrapper[4703]: I0309 13:36:04.514672 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cmz9\" (UniqueName: \"kubernetes.io/projected/c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e-kube-api-access-5cmz9\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:05 crc kubenswrapper[4703]: I0309 13:36:05.046958 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551056-wc7wk" event={"ID":"c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e","Type":"ContainerDied","Data":"a69ca0536a00c0863a2534857da919744ee878480468aa7d540e754a73498cad"} Mar 09 13:36:05 crc kubenswrapper[4703]: I0309 13:36:05.047016 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a69ca0536a00c0863a2534857da919744ee878480468aa7d540e754a73498cad" Mar 09 13:36:05 crc kubenswrapper[4703]: I0309 13:36:05.047100 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551056-wc7wk" Mar 09 13:36:05 crc kubenswrapper[4703]: I0309 13:36:05.138334 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-tzq9d" Mar 09 13:36:05 crc kubenswrapper[4703]: I0309 13:36:05.138375 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-tzq9d" Mar 09 13:36:05 crc kubenswrapper[4703]: I0309 13:36:05.178549 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-tzq9d" Mar 09 13:36:05 crc kubenswrapper[4703]: I0309 13:36:05.438318 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551050-w874x"] Mar 09 13:36:05 crc kubenswrapper[4703]: I0309 13:36:05.457758 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551050-w874x"] Mar 09 
13:36:06 crc kubenswrapper[4703]: I0309 13:36:06.100714 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-tzq9d" Mar 09 13:36:06 crc kubenswrapper[4703]: I0309 13:36:06.721070 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09b7269c-2186-4e45-bb2d-74ee25925050" path="/var/lib/kubelet/pods/09b7269c-2186-4e45-bb2d-74ee25925050/volumes" Mar 09 13:36:09 crc kubenswrapper[4703]: I0309 13:36:09.500393 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:36:09 crc kubenswrapper[4703]: I0309 13:36:09.500478 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:36:13 crc kubenswrapper[4703]: I0309 13:36:13.448033 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p"] Mar 09 13:36:13 crc kubenswrapper[4703]: E0309 13:36:13.448440 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e" containerName="oc" Mar 09 13:36:13 crc kubenswrapper[4703]: I0309 13:36:13.448462 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e" containerName="oc" Mar 09 13:36:13 crc kubenswrapper[4703]: I0309 13:36:13.448640 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e" containerName="oc" Mar 09 13:36:13 crc kubenswrapper[4703]: I0309 
13:36:13.450200 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p" Mar 09 13:36:13 crc kubenswrapper[4703]: I0309 13:36:13.454067 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-cxl8l" Mar 09 13:36:13 crc kubenswrapper[4703]: I0309 13:36:13.466665 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p"] Mar 09 13:36:13 crc kubenswrapper[4703]: I0309 13:36:13.643552 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/512fe7ca-3877-4ac3-bb40-870e62a89a04-bundle\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p\" (UID: \"512fe7ca-3877-4ac3-bb40-870e62a89a04\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p" Mar 09 13:36:13 crc kubenswrapper[4703]: I0309 13:36:13.643897 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/512fe7ca-3877-4ac3-bb40-870e62a89a04-util\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p\" (UID: \"512fe7ca-3877-4ac3-bb40-870e62a89a04\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p" Mar 09 13:36:13 crc kubenswrapper[4703]: I0309 13:36:13.644002 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4w45\" (UniqueName: \"kubernetes.io/projected/512fe7ca-3877-4ac3-bb40-870e62a89a04-kube-api-access-v4w45\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p\" (UID: \"512fe7ca-3877-4ac3-bb40-870e62a89a04\") " 
pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p" Mar 09 13:36:13 crc kubenswrapper[4703]: I0309 13:36:13.745486 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/512fe7ca-3877-4ac3-bb40-870e62a89a04-bundle\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p\" (UID: \"512fe7ca-3877-4ac3-bb40-870e62a89a04\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p" Mar 09 13:36:13 crc kubenswrapper[4703]: I0309 13:36:13.745942 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/512fe7ca-3877-4ac3-bb40-870e62a89a04-bundle\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p\" (UID: \"512fe7ca-3877-4ac3-bb40-870e62a89a04\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p" Mar 09 13:36:13 crc kubenswrapper[4703]: I0309 13:36:13.746080 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/512fe7ca-3877-4ac3-bb40-870e62a89a04-util\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p\" (UID: \"512fe7ca-3877-4ac3-bb40-870e62a89a04\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p" Mar 09 13:36:13 crc kubenswrapper[4703]: I0309 13:36:13.746308 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/512fe7ca-3877-4ac3-bb40-870e62a89a04-util\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p\" (UID: \"512fe7ca-3877-4ac3-bb40-870e62a89a04\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p" Mar 09 13:36:13 crc kubenswrapper[4703]: I0309 13:36:13.746368 4703 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v4w45\" (UniqueName: \"kubernetes.io/projected/512fe7ca-3877-4ac3-bb40-870e62a89a04-kube-api-access-v4w45\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p\" (UID: \"512fe7ca-3877-4ac3-bb40-870e62a89a04\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p" Mar 09 13:36:13 crc kubenswrapper[4703]: I0309 13:36:13.765578 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4w45\" (UniqueName: \"kubernetes.io/projected/512fe7ca-3877-4ac3-bb40-870e62a89a04-kube-api-access-v4w45\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p\" (UID: \"512fe7ca-3877-4ac3-bb40-870e62a89a04\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p" Mar 09 13:36:13 crc kubenswrapper[4703]: I0309 13:36:13.777249 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p" Mar 09 13:36:13 crc kubenswrapper[4703]: I0309 13:36:13.980152 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p"] Mar 09 13:36:13 crc kubenswrapper[4703]: W0309 13:36:13.982579 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod512fe7ca_3877_4ac3_bb40_870e62a89a04.slice/crio-bd91344497f08a222a14b5cb37c104914cc3bf45d4c2e3b098638343a3b48077 WatchSource:0}: Error finding container bd91344497f08a222a14b5cb37c104914cc3bf45d4c2e3b098638343a3b48077: Status 404 returned error can't find the container with id bd91344497f08a222a14b5cb37c104914cc3bf45d4c2e3b098638343a3b48077 Mar 09 13:36:14 crc kubenswrapper[4703]: I0309 13:36:14.113303 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p" event={"ID":"512fe7ca-3877-4ac3-bb40-870e62a89a04","Type":"ContainerStarted","Data":"bd91344497f08a222a14b5cb37c104914cc3bf45d4c2e3b098638343a3b48077"} Mar 09 13:36:15 crc kubenswrapper[4703]: I0309 13:36:15.121278 4703 generic.go:334] "Generic (PLEG): container finished" podID="512fe7ca-3877-4ac3-bb40-870e62a89a04" containerID="0774a838ced41224ad9f3a8b9ace73875f3f26aa3d54b268fb8054f0d2176f97" exitCode=0 Mar 09 13:36:15 crc kubenswrapper[4703]: I0309 13:36:15.121319 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p" event={"ID":"512fe7ca-3877-4ac3-bb40-870e62a89a04","Type":"ContainerDied","Data":"0774a838ced41224ad9f3a8b9ace73875f3f26aa3d54b268fb8054f0d2176f97"} Mar 09 13:36:16 crc kubenswrapper[4703]: I0309 13:36:16.141334 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p" event={"ID":"512fe7ca-3877-4ac3-bb40-870e62a89a04","Type":"ContainerStarted","Data":"dd8b7bfcfb6ade8ba3bac6acd1811eafabf777b2ddb95e45644c6329c6651a5f"} Mar 09 13:36:17 crc kubenswrapper[4703]: I0309 13:36:17.151980 4703 generic.go:334] "Generic (PLEG): container finished" podID="512fe7ca-3877-4ac3-bb40-870e62a89a04" containerID="dd8b7bfcfb6ade8ba3bac6acd1811eafabf777b2ddb95e45644c6329c6651a5f" exitCode=0 Mar 09 13:36:17 crc kubenswrapper[4703]: I0309 13:36:17.152129 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p" event={"ID":"512fe7ca-3877-4ac3-bb40-870e62a89a04","Type":"ContainerDied","Data":"dd8b7bfcfb6ade8ba3bac6acd1811eafabf777b2ddb95e45644c6329c6651a5f"} Mar 09 13:36:18 crc kubenswrapper[4703]: I0309 13:36:18.165825 4703 generic.go:334] "Generic (PLEG): container finished" podID="512fe7ca-3877-4ac3-bb40-870e62a89a04" 
containerID="019eaed7210329cbeff0c2e64196ddbc02792a17db10f821e0b1e2fb95bcd462" exitCode=0 Mar 09 13:36:18 crc kubenswrapper[4703]: I0309 13:36:18.165925 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p" event={"ID":"512fe7ca-3877-4ac3-bb40-870e62a89a04","Type":"ContainerDied","Data":"019eaed7210329cbeff0c2e64196ddbc02792a17db10f821e0b1e2fb95bcd462"} Mar 09 13:36:19 crc kubenswrapper[4703]: I0309 13:36:19.489475 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p" Mar 09 13:36:19 crc kubenswrapper[4703]: I0309 13:36:19.637478 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4w45\" (UniqueName: \"kubernetes.io/projected/512fe7ca-3877-4ac3-bb40-870e62a89a04-kube-api-access-v4w45\") pod \"512fe7ca-3877-4ac3-bb40-870e62a89a04\" (UID: \"512fe7ca-3877-4ac3-bb40-870e62a89a04\") " Mar 09 13:36:19 crc kubenswrapper[4703]: I0309 13:36:19.637618 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/512fe7ca-3877-4ac3-bb40-870e62a89a04-util\") pod \"512fe7ca-3877-4ac3-bb40-870e62a89a04\" (UID: \"512fe7ca-3877-4ac3-bb40-870e62a89a04\") " Mar 09 13:36:19 crc kubenswrapper[4703]: I0309 13:36:19.637780 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/512fe7ca-3877-4ac3-bb40-870e62a89a04-bundle\") pod \"512fe7ca-3877-4ac3-bb40-870e62a89a04\" (UID: \"512fe7ca-3877-4ac3-bb40-870e62a89a04\") " Mar 09 13:36:19 crc kubenswrapper[4703]: I0309 13:36:19.641082 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/512fe7ca-3877-4ac3-bb40-870e62a89a04-bundle" (OuterVolumeSpecName: "bundle") pod 
"512fe7ca-3877-4ac3-bb40-870e62a89a04" (UID: "512fe7ca-3877-4ac3-bb40-870e62a89a04"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:36:19 crc kubenswrapper[4703]: I0309 13:36:19.645206 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/512fe7ca-3877-4ac3-bb40-870e62a89a04-kube-api-access-v4w45" (OuterVolumeSpecName: "kube-api-access-v4w45") pod "512fe7ca-3877-4ac3-bb40-870e62a89a04" (UID: "512fe7ca-3877-4ac3-bb40-870e62a89a04"). InnerVolumeSpecName "kube-api-access-v4w45". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:36:19 crc kubenswrapper[4703]: I0309 13:36:19.740070 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4w45\" (UniqueName: \"kubernetes.io/projected/512fe7ca-3877-4ac3-bb40-870e62a89a04-kube-api-access-v4w45\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:19 crc kubenswrapper[4703]: I0309 13:36:19.740129 4703 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/512fe7ca-3877-4ac3-bb40-870e62a89a04-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:19 crc kubenswrapper[4703]: I0309 13:36:19.889365 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/512fe7ca-3877-4ac3-bb40-870e62a89a04-util" (OuterVolumeSpecName: "util") pod "512fe7ca-3877-4ac3-bb40-870e62a89a04" (UID: "512fe7ca-3877-4ac3-bb40-870e62a89a04"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:36:19 crc kubenswrapper[4703]: I0309 13:36:19.942877 4703 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/512fe7ca-3877-4ac3-bb40-870e62a89a04-util\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:20 crc kubenswrapper[4703]: I0309 13:36:20.183524 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p" event={"ID":"512fe7ca-3877-4ac3-bb40-870e62a89a04","Type":"ContainerDied","Data":"bd91344497f08a222a14b5cb37c104914cc3bf45d4c2e3b098638343a3b48077"} Mar 09 13:36:20 crc kubenswrapper[4703]: I0309 13:36:20.184549 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd91344497f08a222a14b5cb37c104914cc3bf45d4c2e3b098638343a3b48077" Mar 09 13:36:20 crc kubenswrapper[4703]: I0309 13:36:20.183631 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p" Mar 09 13:36:26 crc kubenswrapper[4703]: I0309 13:36:26.102084 4703 scope.go:117] "RemoveContainer" containerID="036724e1736742cdb81670ed160106d0c3f8be2570bc6b51cdbed1ae77f91a19" Mar 09 13:36:30 crc kubenswrapper[4703]: I0309 13:36:30.992745 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq"] Mar 09 13:36:30 crc kubenswrapper[4703]: E0309 13:36:30.993309 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512fe7ca-3877-4ac3-bb40-870e62a89a04" containerName="extract" Mar 09 13:36:30 crc kubenswrapper[4703]: I0309 13:36:30.993321 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="512fe7ca-3877-4ac3-bb40-870e62a89a04" containerName="extract" Mar 09 13:36:30 crc kubenswrapper[4703]: E0309 13:36:30.993335 4703 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="512fe7ca-3877-4ac3-bb40-870e62a89a04" containerName="util" Mar 09 13:36:30 crc kubenswrapper[4703]: I0309 13:36:30.993341 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="512fe7ca-3877-4ac3-bb40-870e62a89a04" containerName="util" Mar 09 13:36:30 crc kubenswrapper[4703]: E0309 13:36:30.993351 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512fe7ca-3877-4ac3-bb40-870e62a89a04" containerName="pull" Mar 09 13:36:30 crc kubenswrapper[4703]: I0309 13:36:30.993358 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="512fe7ca-3877-4ac3-bb40-870e62a89a04" containerName="pull" Mar 09 13:36:30 crc kubenswrapper[4703]: I0309 13:36:30.993464 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="512fe7ca-3877-4ac3-bb40-870e62a89a04" containerName="extract" Mar 09 13:36:30 crc kubenswrapper[4703]: I0309 13:36:30.993810 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq" Mar 09 13:36:30 crc kubenswrapper[4703]: I0309 13:36:30.996543 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Mar 09 13:36:30 crc kubenswrapper[4703]: I0309 13:36:30.996934 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-4m4cq" Mar 09 13:36:31 crc kubenswrapper[4703]: I0309 13:36:31.012500 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq"] Mar 09 13:36:31 crc kubenswrapper[4703]: I0309 13:36:31.093692 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b3ec581a-8cf6-40d6-abfa-347b39c624c2-webhook-cert\") pod \"infra-operator-controller-manager-68c564b879-m5wgq\" (UID: 
\"b3ec581a-8cf6-40d6-abfa-347b39c624c2\") " pod="openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq" Mar 09 13:36:31 crc kubenswrapper[4703]: I0309 13:36:31.093757 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6lbf\" (UniqueName: \"kubernetes.io/projected/b3ec581a-8cf6-40d6-abfa-347b39c624c2-kube-api-access-p6lbf\") pod \"infra-operator-controller-manager-68c564b879-m5wgq\" (UID: \"b3ec581a-8cf6-40d6-abfa-347b39c624c2\") " pod="openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq" Mar 09 13:36:31 crc kubenswrapper[4703]: I0309 13:36:31.093805 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b3ec581a-8cf6-40d6-abfa-347b39c624c2-apiservice-cert\") pod \"infra-operator-controller-manager-68c564b879-m5wgq\" (UID: \"b3ec581a-8cf6-40d6-abfa-347b39c624c2\") " pod="openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq" Mar 09 13:36:31 crc kubenswrapper[4703]: I0309 13:36:31.195645 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b3ec581a-8cf6-40d6-abfa-347b39c624c2-apiservice-cert\") pod \"infra-operator-controller-manager-68c564b879-m5wgq\" (UID: \"b3ec581a-8cf6-40d6-abfa-347b39c624c2\") " pod="openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq" Mar 09 13:36:31 crc kubenswrapper[4703]: I0309 13:36:31.195766 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b3ec581a-8cf6-40d6-abfa-347b39c624c2-webhook-cert\") pod \"infra-operator-controller-manager-68c564b879-m5wgq\" (UID: \"b3ec581a-8cf6-40d6-abfa-347b39c624c2\") " pod="openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq" Mar 09 13:36:31 crc kubenswrapper[4703]: I0309 13:36:31.195786 
4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6lbf\" (UniqueName: \"kubernetes.io/projected/b3ec581a-8cf6-40d6-abfa-347b39c624c2-kube-api-access-p6lbf\") pod \"infra-operator-controller-manager-68c564b879-m5wgq\" (UID: \"b3ec581a-8cf6-40d6-abfa-347b39c624c2\") " pod="openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq" Mar 09 13:36:31 crc kubenswrapper[4703]: I0309 13:36:31.201668 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b3ec581a-8cf6-40d6-abfa-347b39c624c2-webhook-cert\") pod \"infra-operator-controller-manager-68c564b879-m5wgq\" (UID: \"b3ec581a-8cf6-40d6-abfa-347b39c624c2\") " pod="openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq" Mar 09 13:36:31 crc kubenswrapper[4703]: I0309 13:36:31.201828 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b3ec581a-8cf6-40d6-abfa-347b39c624c2-apiservice-cert\") pod \"infra-operator-controller-manager-68c564b879-m5wgq\" (UID: \"b3ec581a-8cf6-40d6-abfa-347b39c624c2\") " pod="openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq" Mar 09 13:36:31 crc kubenswrapper[4703]: I0309 13:36:31.217491 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6lbf\" (UniqueName: \"kubernetes.io/projected/b3ec581a-8cf6-40d6-abfa-347b39c624c2-kube-api-access-p6lbf\") pod \"infra-operator-controller-manager-68c564b879-m5wgq\" (UID: \"b3ec581a-8cf6-40d6-abfa-347b39c624c2\") " pod="openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq" Mar 09 13:36:31 crc kubenswrapper[4703]: I0309 13:36:31.310194 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq" Mar 09 13:36:31 crc kubenswrapper[4703]: I0309 13:36:31.744665 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq"] Mar 09 13:36:32 crc kubenswrapper[4703]: I0309 13:36:32.266402 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq" event={"ID":"b3ec581a-8cf6-40d6-abfa-347b39c624c2","Type":"ContainerStarted","Data":"952c413be6840ef2ae2f32812f4911f000526aa60facd8a7b8234637e0f2cfe4"} Mar 09 13:36:32 crc kubenswrapper[4703]: I0309 13:36:32.391144 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cfxln"] Mar 09 13:36:32 crc kubenswrapper[4703]: I0309 13:36:32.392517 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cfxln" Mar 09 13:36:32 crc kubenswrapper[4703]: I0309 13:36:32.406873 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cfxln"] Mar 09 13:36:32 crc kubenswrapper[4703]: I0309 13:36:32.413121 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf86b\" (UniqueName: \"kubernetes.io/projected/96332f87-9756-458e-881a-5e97b24c1591-kube-api-access-bf86b\") pod \"community-operators-cfxln\" (UID: \"96332f87-9756-458e-881a-5e97b24c1591\") " pod="openshift-marketplace/community-operators-cfxln" Mar 09 13:36:32 crc kubenswrapper[4703]: I0309 13:36:32.413189 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96332f87-9756-458e-881a-5e97b24c1591-utilities\") pod \"community-operators-cfxln\" (UID: \"96332f87-9756-458e-881a-5e97b24c1591\") " 
pod="openshift-marketplace/community-operators-cfxln" Mar 09 13:36:32 crc kubenswrapper[4703]: I0309 13:36:32.413274 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96332f87-9756-458e-881a-5e97b24c1591-catalog-content\") pod \"community-operators-cfxln\" (UID: \"96332f87-9756-458e-881a-5e97b24c1591\") " pod="openshift-marketplace/community-operators-cfxln" Mar 09 13:36:32 crc kubenswrapper[4703]: I0309 13:36:32.514432 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96332f87-9756-458e-881a-5e97b24c1591-utilities\") pod \"community-operators-cfxln\" (UID: \"96332f87-9756-458e-881a-5e97b24c1591\") " pod="openshift-marketplace/community-operators-cfxln" Mar 09 13:36:32 crc kubenswrapper[4703]: I0309 13:36:32.514521 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96332f87-9756-458e-881a-5e97b24c1591-catalog-content\") pod \"community-operators-cfxln\" (UID: \"96332f87-9756-458e-881a-5e97b24c1591\") " pod="openshift-marketplace/community-operators-cfxln" Mar 09 13:36:32 crc kubenswrapper[4703]: I0309 13:36:32.514578 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf86b\" (UniqueName: \"kubernetes.io/projected/96332f87-9756-458e-881a-5e97b24c1591-kube-api-access-bf86b\") pod \"community-operators-cfxln\" (UID: \"96332f87-9756-458e-881a-5e97b24c1591\") " pod="openshift-marketplace/community-operators-cfxln" Mar 09 13:36:32 crc kubenswrapper[4703]: I0309 13:36:32.515024 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96332f87-9756-458e-881a-5e97b24c1591-utilities\") pod \"community-operators-cfxln\" (UID: \"96332f87-9756-458e-881a-5e97b24c1591\") " 
pod="openshift-marketplace/community-operators-cfxln" Mar 09 13:36:32 crc kubenswrapper[4703]: I0309 13:36:32.515235 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96332f87-9756-458e-881a-5e97b24c1591-catalog-content\") pod \"community-operators-cfxln\" (UID: \"96332f87-9756-458e-881a-5e97b24c1591\") " pod="openshift-marketplace/community-operators-cfxln" Mar 09 13:36:32 crc kubenswrapper[4703]: I0309 13:36:32.552139 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf86b\" (UniqueName: \"kubernetes.io/projected/96332f87-9756-458e-881a-5e97b24c1591-kube-api-access-bf86b\") pod \"community-operators-cfxln\" (UID: \"96332f87-9756-458e-881a-5e97b24c1591\") " pod="openshift-marketplace/community-operators-cfxln" Mar 09 13:36:32 crc kubenswrapper[4703]: I0309 13:36:32.709693 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cfxln" Mar 09 13:36:33 crc kubenswrapper[4703]: I0309 13:36:33.104035 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cfxln"] Mar 09 13:36:33 crc kubenswrapper[4703]: I0309 13:36:33.274897 4703 generic.go:334] "Generic (PLEG): container finished" podID="96332f87-9756-458e-881a-5e97b24c1591" containerID="881b6a5a154b88c6d43e67f198bb7192ffe6c5f814860b2c313a791a61401142" exitCode=0 Mar 09 13:36:33 crc kubenswrapper[4703]: I0309 13:36:33.274939 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfxln" event={"ID":"96332f87-9756-458e-881a-5e97b24c1591","Type":"ContainerDied","Data":"881b6a5a154b88c6d43e67f198bb7192ffe6c5f814860b2c313a791a61401142"} Mar 09 13:36:33 crc kubenswrapper[4703]: I0309 13:36:33.274964 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfxln" 
event={"ID":"96332f87-9756-458e-881a-5e97b24c1591","Type":"ContainerStarted","Data":"3e2f66ef7dae6f33bf1dd7611ddca690f222b033e1badbff8e246ebb34f58f54"} Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.650206 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/openstack-galera-0"] Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.652013 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.654221 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"manila-kuttl-tests"/"openstack-scripts" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.654821 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"manila-kuttl-tests"/"kube-root-ca.crt" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.655243 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"galera-openstack-dockercfg-thxn8" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.655873 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"manila-kuttl-tests"/"openshift-service-ca.crt" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.657623 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/openstack-galera-0"] Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.657971 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"manila-kuttl-tests"/"openstack-config-data" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.670933 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/openstack-galera-1"] Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.672236 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.687934 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/openstack-galera-2"] Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.689603 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.706108 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/openstack-galera-1"] Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.722790 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/openstack-galera-2"] Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.843255 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-1\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.843382 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b66gn\" (UniqueName: \"kubernetes.io/projected/06f52029-696b-414e-a98e-0266b9f71c15-kube-api-access-b66gn\") pod \"openstack-galera-1\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.843441 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kpcl\" (UniqueName: \"kubernetes.io/projected/03a80206-7d7f-42bc-b3ce-57abda992a6a-kube-api-access-7kpcl\") pod \"openstack-galera-2\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.843473 
4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-kolla-config\") pod \"openstack-galera-0\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.843511 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/06f52029-696b-414e-a98e-0266b9f71c15-config-data-default\") pod \"openstack-galera-1\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.843551 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.843582 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.843618 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-config-data-default\") pod \"openstack-galera-0\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.843722 4703 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/06f52029-696b-414e-a98e-0266b9f71c15-kolla-config\") pod \"openstack-galera-1\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.843767 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-2\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.843800 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.844084 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/06f52029-696b-414e-a98e-0266b9f71c15-config-data-generated\") pod \"openstack-galera-1\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.844162 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03a80206-7d7f-42bc-b3ce-57abda992a6a-operator-scripts\") pod \"openstack-galera-2\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.844192 4703 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/03a80206-7d7f-42bc-b3ce-57abda992a6a-config-data-generated\") pod \"openstack-galera-2\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.844292 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hhbh\" (UniqueName: \"kubernetes.io/projected/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-kube-api-access-6hhbh\") pod \"openstack-galera-0\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.844335 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/03a80206-7d7f-42bc-b3ce-57abda992a6a-config-data-default\") pod \"openstack-galera-2\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.844384 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06f52029-696b-414e-a98e-0266b9f71c15-operator-scripts\") pod \"openstack-galera-1\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.844409 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/03a80206-7d7f-42bc-b3ce-57abda992a6a-kolla-config\") pod \"openstack-galera-2\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:34 crc kubenswrapper[4703]: 
I0309 13:36:34.945431 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/06f52029-696b-414e-a98e-0266b9f71c15-kolla-config\") pod \"openstack-galera-1\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.945700 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-2\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.945809 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.945946 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/06f52029-696b-414e-a98e-0266b9f71c15-config-data-generated\") pod \"openstack-galera-1\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.946045 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03a80206-7d7f-42bc-b3ce-57abda992a6a-operator-scripts\") pod \"openstack-galera-2\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.946114 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-2\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") device mount path \"/mnt/openstack/pv06\"" pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.946141 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/03a80206-7d7f-42bc-b3ce-57abda992a6a-config-data-generated\") pod \"openstack-galera-2\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.946449 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hhbh\" (UniqueName: \"kubernetes.io/projected/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-kube-api-access-6hhbh\") pod \"openstack-galera-0\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.946525 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06f52029-696b-414e-a98e-0266b9f71c15-operator-scripts\") pod \"openstack-galera-1\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.946549 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/03a80206-7d7f-42bc-b3ce-57abda992a6a-config-data-generated\") pod \"openstack-galera-2\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.946572 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/03a80206-7d7f-42bc-b3ce-57abda992a6a-kolla-config\") pod \"openstack-galera-2\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.946607 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/03a80206-7d7f-42bc-b3ce-57abda992a6a-config-data-default\") pod \"openstack-galera-2\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.946622 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/06f52029-696b-414e-a98e-0266b9f71c15-kolla-config\") pod \"openstack-galera-1\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.946648 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-1\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.946744 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b66gn\" (UniqueName: \"kubernetes.io/projected/06f52029-696b-414e-a98e-0266b9f71c15-kube-api-access-b66gn\") pod \"openstack-galera-1\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.946792 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kpcl\" (UniqueName: \"kubernetes.io/projected/03a80206-7d7f-42bc-b3ce-57abda992a6a-kube-api-access-7kpcl\") pod \"openstack-galera-2\" 
(UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.946823 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-kolla-config\") pod \"openstack-galera-0\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.946903 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/06f52029-696b-414e-a98e-0266b9f71c15-config-data-default\") pod \"openstack-galera-1\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.946965 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.947010 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.947047 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-config-data-default\") pod \"openstack-galera-0\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " pod="manila-kuttl-tests/openstack-galera-0" Mar 09 
13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.947105 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/06f52029-696b-414e-a98e-0266b9f71c15-config-data-generated\") pod \"openstack-galera-1\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.946793 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-1\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") device mount path \"/mnt/openstack/pv12\"" pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.947891 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/06f52029-696b-414e-a98e-0266b9f71c15-config-data-default\") pod \"openstack-galera-1\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.948003 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/03a80206-7d7f-42bc-b3ce-57abda992a6a-config-data-default\") pod \"openstack-galera-2\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.948004 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/03a80206-7d7f-42bc-b3ce-57abda992a6a-kolla-config\") pod \"openstack-galera-2\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.948128 4703 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") device mount path \"/mnt/openstack/pv08\"" pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.948260 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06f52029-696b-414e-a98e-0266b9f71c15-operator-scripts\") pod \"openstack-galera-1\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.948304 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.948439 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-config-data-default\") pod \"openstack-galera-0\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.948510 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-kolla-config\") pod \"openstack-galera-0\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.951345 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/03a80206-7d7f-42bc-b3ce-57abda992a6a-operator-scripts\") pod \"openstack-galera-2\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.952835 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.965989 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-1\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.968032 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-2\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.972917 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hhbh\" (UniqueName: \"kubernetes.io/projected/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-kube-api-access-6hhbh\") pod \"openstack-galera-0\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.974149 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kpcl\" (UniqueName: \"kubernetes.io/projected/03a80206-7d7f-42bc-b3ce-57abda992a6a-kube-api-access-7kpcl\") pod \"openstack-galera-2\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " 
pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.977381 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b66gn\" (UniqueName: \"kubernetes.io/projected/06f52029-696b-414e-a98e-0266b9f71c15-kube-api-access-b66gn\") pod \"openstack-galera-1\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.984080 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:34 crc kubenswrapper[4703]: I0309 13:36:34.990053 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:35 crc kubenswrapper[4703]: I0309 13:36:35.013455 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:35 crc kubenswrapper[4703]: I0309 13:36:35.017998 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:35 crc kubenswrapper[4703]: I0309 13:36:35.325610 4703 generic.go:334] "Generic (PLEG): container finished" podID="96332f87-9756-458e-881a-5e97b24c1591" containerID="0074998fe6250b3fda493a248c8e4a30666a2d0494885ccc1f767dabf62b8c26" exitCode=0 Mar 09 13:36:35 crc kubenswrapper[4703]: I0309 13:36:35.325771 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfxln" event={"ID":"96332f87-9756-458e-881a-5e97b24c1591","Type":"ContainerDied","Data":"0074998fe6250b3fda493a248c8e4a30666a2d0494885ccc1f767dabf62b8c26"} Mar 09 13:36:35 crc kubenswrapper[4703]: I0309 13:36:35.328076 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq" event={"ID":"b3ec581a-8cf6-40d6-abfa-347b39c624c2","Type":"ContainerStarted","Data":"c31a847567ac3c4ad6903e46355455d2559ea249ce7ba506a5e804807c2bd262"} Mar 09 13:36:35 crc kubenswrapper[4703]: I0309 13:36:35.328454 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq" Mar 09 13:36:35 crc kubenswrapper[4703]: I0309 13:36:35.378267 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq" podStartSLOduration=2.865477697 podStartE2EDuration="5.378246781s" podCreationTimestamp="2026-03-09 13:36:30 +0000 UTC" firstStartedPulling="2026-03-09 13:36:31.761420876 +0000 UTC m=+987.728836572" lastFinishedPulling="2026-03-09 13:36:34.27418992 +0000 UTC m=+990.241605656" observedRunningTime="2026-03-09 13:36:35.374038247 +0000 UTC m=+991.341453933" watchObservedRunningTime="2026-03-09 13:36:35.378246781 +0000 UTC m=+991.345662467" Mar 09 13:36:35 crc kubenswrapper[4703]: I0309 13:36:35.384992 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["manila-kuttl-tests/openstack-galera-1"] Mar 09 13:36:35 crc kubenswrapper[4703]: I0309 13:36:35.468744 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/openstack-galera-0"] Mar 09 13:36:35 crc kubenswrapper[4703]: I0309 13:36:35.479698 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/openstack-galera-2"] Mar 09 13:36:36 crc kubenswrapper[4703]: I0309 13:36:36.336109 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfxln" event={"ID":"96332f87-9756-458e-881a-5e97b24c1591","Type":"ContainerStarted","Data":"77ac96d16b523eb4ff1ea57ae452cbf9c11f1bcf05f6554f06e5cf7983ca8034"} Mar 09 13:36:36 crc kubenswrapper[4703]: I0309 13:36:36.337114 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/openstack-galera-2" event={"ID":"03a80206-7d7f-42bc-b3ce-57abda992a6a","Type":"ContainerStarted","Data":"08a72f25cbbb69e93ea3bbe3b3869b3cffe9b41ddac4c01ec0c8bb001043574b"} Mar 09 13:36:36 crc kubenswrapper[4703]: I0309 13:36:36.337943 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/openstack-galera-1" event={"ID":"06f52029-696b-414e-a98e-0266b9f71c15","Type":"ContainerStarted","Data":"28a6046a870ef50a5d8f8e278e8c4eb4dd9f779df681c6e0f3f65978e6f5834b"} Mar 09 13:36:36 crc kubenswrapper[4703]: I0309 13:36:36.339197 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/openstack-galera-0" event={"ID":"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312","Type":"ContainerStarted","Data":"6f8e27954b34a736c62fac15fd26f03de013d5893e442f5cab5a9c8c600a823e"} Mar 09 13:36:36 crc kubenswrapper[4703]: I0309 13:36:36.355664 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cfxln" podStartSLOduration=1.8816298580000002 podStartE2EDuration="4.355647347s" podCreationTimestamp="2026-03-09 13:36:32 +0000 UTC" firstStartedPulling="2026-03-09 
13:36:33.278151651 +0000 UTC m=+989.245567337" lastFinishedPulling="2026-03-09 13:36:35.75216913 +0000 UTC m=+991.719584826" observedRunningTime="2026-03-09 13:36:36.352834831 +0000 UTC m=+992.320250517" watchObservedRunningTime="2026-03-09 13:36:36.355647347 +0000 UTC m=+992.323063033" Mar 09 13:36:39 crc kubenswrapper[4703]: I0309 13:36:39.500358 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:36:39 crc kubenswrapper[4703]: I0309 13:36:39.500701 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:36:41 crc kubenswrapper[4703]: I0309 13:36:41.314881 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq" Mar 09 13:36:42 crc kubenswrapper[4703]: I0309 13:36:42.714921 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cfxln" Mar 09 13:36:42 crc kubenswrapper[4703]: I0309 13:36:42.714957 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cfxln" Mar 09 13:36:42 crc kubenswrapper[4703]: I0309 13:36:42.765176 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cfxln" Mar 09 13:36:43 crc kubenswrapper[4703]: I0309 13:36:43.379726 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/memcached-0"] Mar 09 13:36:43 crc 
kubenswrapper[4703]: I0309 13:36:43.380694 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/memcached-0" Mar 09 13:36:43 crc kubenswrapper[4703]: I0309 13:36:43.384054 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"manila-kuttl-tests"/"memcached-config-data" Mar 09 13:36:43 crc kubenswrapper[4703]: I0309 13:36:43.384277 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"memcached-memcached-dockercfg-nbpb7" Mar 09 13:36:43 crc kubenswrapper[4703]: I0309 13:36:43.397640 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/memcached-0"] Mar 09 13:36:43 crc kubenswrapper[4703]: I0309 13:36:43.405792 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9b9d9316-361e-431d-ad43-9fd1a8cb72c1-kolla-config\") pod \"memcached-0\" (UID: \"9b9d9316-361e-431d-ad43-9fd1a8cb72c1\") " pod="manila-kuttl-tests/memcached-0" Mar 09 13:36:43 crc kubenswrapper[4703]: I0309 13:36:43.405910 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b77qp\" (UniqueName: \"kubernetes.io/projected/9b9d9316-361e-431d-ad43-9fd1a8cb72c1-kube-api-access-b77qp\") pod \"memcached-0\" (UID: \"9b9d9316-361e-431d-ad43-9fd1a8cb72c1\") " pod="manila-kuttl-tests/memcached-0" Mar 09 13:36:43 crc kubenswrapper[4703]: I0309 13:36:43.405939 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b9d9316-361e-431d-ad43-9fd1a8cb72c1-config-data\") pod \"memcached-0\" (UID: \"9b9d9316-361e-431d-ad43-9fd1a8cb72c1\") " pod="manila-kuttl-tests/memcached-0" Mar 09 13:36:43 crc kubenswrapper[4703]: I0309 13:36:43.429162 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-cfxln" Mar 09 13:36:43 crc kubenswrapper[4703]: I0309 13:36:43.506897 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b77qp\" (UniqueName: \"kubernetes.io/projected/9b9d9316-361e-431d-ad43-9fd1a8cb72c1-kube-api-access-b77qp\") pod \"memcached-0\" (UID: \"9b9d9316-361e-431d-ad43-9fd1a8cb72c1\") " pod="manila-kuttl-tests/memcached-0" Mar 09 13:36:43 crc kubenswrapper[4703]: I0309 13:36:43.506945 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b9d9316-361e-431d-ad43-9fd1a8cb72c1-config-data\") pod \"memcached-0\" (UID: \"9b9d9316-361e-431d-ad43-9fd1a8cb72c1\") " pod="manila-kuttl-tests/memcached-0" Mar 09 13:36:43 crc kubenswrapper[4703]: I0309 13:36:43.507050 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9b9d9316-361e-431d-ad43-9fd1a8cb72c1-kolla-config\") pod \"memcached-0\" (UID: \"9b9d9316-361e-431d-ad43-9fd1a8cb72c1\") " pod="manila-kuttl-tests/memcached-0" Mar 09 13:36:43 crc kubenswrapper[4703]: I0309 13:36:43.507791 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9b9d9316-361e-431d-ad43-9fd1a8cb72c1-kolla-config\") pod \"memcached-0\" (UID: \"9b9d9316-361e-431d-ad43-9fd1a8cb72c1\") " pod="manila-kuttl-tests/memcached-0" Mar 09 13:36:43 crc kubenswrapper[4703]: I0309 13:36:43.507987 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b9d9316-361e-431d-ad43-9fd1a8cb72c1-config-data\") pod \"memcached-0\" (UID: \"9b9d9316-361e-431d-ad43-9fd1a8cb72c1\") " pod="manila-kuttl-tests/memcached-0" Mar 09 13:36:43 crc kubenswrapper[4703]: I0309 13:36:43.528088 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-b77qp\" (UniqueName: \"kubernetes.io/projected/9b9d9316-361e-431d-ad43-9fd1a8cb72c1-kube-api-access-b77qp\") pod \"memcached-0\" (UID: \"9b9d9316-361e-431d-ad43-9fd1a8cb72c1\") " pod="manila-kuttl-tests/memcached-0" Mar 09 13:36:43 crc kubenswrapper[4703]: I0309 13:36:43.698884 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/memcached-0" Mar 09 13:36:44 crc kubenswrapper[4703]: I0309 13:36:44.391418 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/openstack-galera-0" event={"ID":"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312","Type":"ContainerStarted","Data":"31629a0c9aeb79d45ac6353c332a6a5f78596c0dc81792d6cdea346e9de95757"} Mar 09 13:36:44 crc kubenswrapper[4703]: I0309 13:36:44.393167 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/openstack-galera-2" event={"ID":"03a80206-7d7f-42bc-b3ce-57abda992a6a","Type":"ContainerStarted","Data":"02ab23f31789f50d5c3f466ca3c74093bc62fc11f46ec2488e638ace358ce235"} Mar 09 13:36:44 crc kubenswrapper[4703]: I0309 13:36:44.395684 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/openstack-galera-1" event={"ID":"06f52029-696b-414e-a98e-0266b9f71c15","Type":"ContainerStarted","Data":"937469ac632bff0dc68baee5aa6e25086b4f030646903883b401fef41fde5b6e"} Mar 09 13:36:44 crc kubenswrapper[4703]: I0309 13:36:44.435596 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/memcached-0"] Mar 09 13:36:44 crc kubenswrapper[4703]: W0309 13:36:44.435628 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b9d9316_361e_431d_ad43_9fd1a8cb72c1.slice/crio-917b2310b7527e63cea3f99f09685f63e7d8b91fae0c52252d86c8a8543d368b WatchSource:0}: Error finding container 917b2310b7527e63cea3f99f09685f63e7d8b91fae0c52252d86c8a8543d368b: Status 404 returned error can't find the container with id 
917b2310b7527e63cea3f99f09685f63e7d8b91fae0c52252d86c8a8543d368b Mar 09 13:36:45 crc kubenswrapper[4703]: I0309 13:36:45.399664 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/memcached-0" event={"ID":"9b9d9316-361e-431d-ad43-9fd1a8cb72c1","Type":"ContainerStarted","Data":"917b2310b7527e63cea3f99f09685f63e7d8b91fae0c52252d86c8a8543d368b"} Mar 09 13:36:46 crc kubenswrapper[4703]: I0309 13:36:46.782291 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cfxln"] Mar 09 13:36:46 crc kubenswrapper[4703]: I0309 13:36:46.782776 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cfxln" podUID="96332f87-9756-458e-881a-5e97b24c1591" containerName="registry-server" containerID="cri-o://77ac96d16b523eb4ff1ea57ae452cbf9c11f1bcf05f6554f06e5cf7983ca8034" gracePeriod=2 Mar 09 13:36:46 crc kubenswrapper[4703]: E0309 13:36:46.895811 4703 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96332f87_9756_458e_881a_5e97b24c1591.slice/crio-77ac96d16b523eb4ff1ea57ae452cbf9c11f1bcf05f6554f06e5cf7983ca8034.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96332f87_9756_458e_881a_5e97b24c1591.slice/crio-conmon-77ac96d16b523eb4ff1ea57ae452cbf9c11f1bcf05f6554f06e5cf7983ca8034.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:36:47 crc kubenswrapper[4703]: I0309 13:36:47.182717 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-lw4f8"] Mar 09 13:36:47 crc kubenswrapper[4703]: I0309 13:36:47.183553 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-lw4f8" Mar 09 13:36:47 crc kubenswrapper[4703]: I0309 13:36:47.185672 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-hj254" Mar 09 13:36:47 crc kubenswrapper[4703]: I0309 13:36:47.193582 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-lw4f8"] Mar 09 13:36:47 crc kubenswrapper[4703]: I0309 13:36:47.258067 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm4xk\" (UniqueName: \"kubernetes.io/projected/54568517-643e-4d68-bb0b-daf6562bb23d-kube-api-access-vm4xk\") pod \"rabbitmq-cluster-operator-index-lw4f8\" (UID: \"54568517-643e-4d68-bb0b-daf6562bb23d\") " pod="openstack-operators/rabbitmq-cluster-operator-index-lw4f8" Mar 09 13:36:47 crc kubenswrapper[4703]: I0309 13:36:47.359355 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm4xk\" (UniqueName: \"kubernetes.io/projected/54568517-643e-4d68-bb0b-daf6562bb23d-kube-api-access-vm4xk\") pod \"rabbitmq-cluster-operator-index-lw4f8\" (UID: \"54568517-643e-4d68-bb0b-daf6562bb23d\") " pod="openstack-operators/rabbitmq-cluster-operator-index-lw4f8" Mar 09 13:36:47 crc kubenswrapper[4703]: I0309 13:36:47.380885 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm4xk\" (UniqueName: \"kubernetes.io/projected/54568517-643e-4d68-bb0b-daf6562bb23d-kube-api-access-vm4xk\") pod \"rabbitmq-cluster-operator-index-lw4f8\" (UID: \"54568517-643e-4d68-bb0b-daf6562bb23d\") " pod="openstack-operators/rabbitmq-cluster-operator-index-lw4f8" Mar 09 13:36:47 crc kubenswrapper[4703]: I0309 13:36:47.416368 4703 generic.go:334] "Generic (PLEG): container finished" podID="96332f87-9756-458e-881a-5e97b24c1591" 
containerID="77ac96d16b523eb4ff1ea57ae452cbf9c11f1bcf05f6554f06e5cf7983ca8034" exitCode=0 Mar 09 13:36:47 crc kubenswrapper[4703]: I0309 13:36:47.416430 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfxln" event={"ID":"96332f87-9756-458e-881a-5e97b24c1591","Type":"ContainerDied","Data":"77ac96d16b523eb4ff1ea57ae452cbf9c11f1bcf05f6554f06e5cf7983ca8034"} Mar 09 13:36:47 crc kubenswrapper[4703]: I0309 13:36:47.503065 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-lw4f8" Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.047889 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-lw4f8"] Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.089616 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cfxln" Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.182005 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf86b\" (UniqueName: \"kubernetes.io/projected/96332f87-9756-458e-881a-5e97b24c1591-kube-api-access-bf86b\") pod \"96332f87-9756-458e-881a-5e97b24c1591\" (UID: \"96332f87-9756-458e-881a-5e97b24c1591\") " Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.182082 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96332f87-9756-458e-881a-5e97b24c1591-catalog-content\") pod \"96332f87-9756-458e-881a-5e97b24c1591\" (UID: \"96332f87-9756-458e-881a-5e97b24c1591\") " Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.182121 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96332f87-9756-458e-881a-5e97b24c1591-utilities\") pod 
\"96332f87-9756-458e-881a-5e97b24c1591\" (UID: \"96332f87-9756-458e-881a-5e97b24c1591\") " Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.183059 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96332f87-9756-458e-881a-5e97b24c1591-utilities" (OuterVolumeSpecName: "utilities") pod "96332f87-9756-458e-881a-5e97b24c1591" (UID: "96332f87-9756-458e-881a-5e97b24c1591"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.188354 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96332f87-9756-458e-881a-5e97b24c1591-kube-api-access-bf86b" (OuterVolumeSpecName: "kube-api-access-bf86b") pod "96332f87-9756-458e-881a-5e97b24c1591" (UID: "96332f87-9756-458e-881a-5e97b24c1591"). InnerVolumeSpecName "kube-api-access-bf86b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.232031 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96332f87-9756-458e-881a-5e97b24c1591-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96332f87-9756-458e-881a-5e97b24c1591" (UID: "96332f87-9756-458e-881a-5e97b24c1591"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.284218 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf86b\" (UniqueName: \"kubernetes.io/projected/96332f87-9756-458e-881a-5e97b24c1591-kube-api-access-bf86b\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.284485 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96332f87-9756-458e-881a-5e97b24c1591-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.284558 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96332f87-9756-458e-881a-5e97b24c1591-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.421989 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-lw4f8" event={"ID":"54568517-643e-4d68-bb0b-daf6562bb23d","Type":"ContainerStarted","Data":"3d36ca4826ec0c80883da1069d981a1c8ce207d31bf56778673643c0c3063c3f"} Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.424750 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfxln" event={"ID":"96332f87-9756-458e-881a-5e97b24c1591","Type":"ContainerDied","Data":"3e2f66ef7dae6f33bf1dd7611ddca690f222b033e1badbff8e246ebb34f58f54"} Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.424806 4703 scope.go:117] "RemoveContainer" containerID="77ac96d16b523eb4ff1ea57ae452cbf9c11f1bcf05f6554f06e5cf7983ca8034" Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.424765 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cfxln" Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.426137 4703 generic.go:334] "Generic (PLEG): container finished" podID="03a80206-7d7f-42bc-b3ce-57abda992a6a" containerID="02ab23f31789f50d5c3f466ca3c74093bc62fc11f46ec2488e638ace358ce235" exitCode=0 Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.426207 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/openstack-galera-2" event={"ID":"03a80206-7d7f-42bc-b3ce-57abda992a6a","Type":"ContainerDied","Data":"02ab23f31789f50d5c3f466ca3c74093bc62fc11f46ec2488e638ace358ce235"} Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.427292 4703 generic.go:334] "Generic (PLEG): container finished" podID="06f52029-696b-414e-a98e-0266b9f71c15" containerID="937469ac632bff0dc68baee5aa6e25086b4f030646903883b401fef41fde5b6e" exitCode=0 Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.427386 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/openstack-galera-1" event={"ID":"06f52029-696b-414e-a98e-0266b9f71c15","Type":"ContainerDied","Data":"937469ac632bff0dc68baee5aa6e25086b4f030646903883b401fef41fde5b6e"} Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.433370 4703 generic.go:334] "Generic (PLEG): container finished" podID="cc5d4c7a-3e87-40f9-9ff9-073fe3de7312" containerID="31629a0c9aeb79d45ac6353c332a6a5f78596c0dc81792d6cdea346e9de95757" exitCode=0 Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.433564 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/openstack-galera-0" event={"ID":"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312","Type":"ContainerDied","Data":"31629a0c9aeb79d45ac6353c332a6a5f78596c0dc81792d6cdea346e9de95757"} Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.440146 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/memcached-0" 
event={"ID":"9b9d9316-361e-431d-ad43-9fd1a8cb72c1","Type":"ContainerStarted","Data":"7b5dc6f0c381a8f635d088cf9a106ea89eb45e6865dbc15c2ecd1044338bfd9c"} Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.440734 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="manila-kuttl-tests/memcached-0" Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.452891 4703 scope.go:117] "RemoveContainer" containerID="0074998fe6250b3fda493a248c8e4a30666a2d0494885ccc1f767dabf62b8c26" Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.520305 4703 scope.go:117] "RemoveContainer" containerID="881b6a5a154b88c6d43e67f198bb7192ffe6c5f814860b2c313a791a61401142" Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.553819 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/memcached-0" podStartSLOduration=2.103836612 podStartE2EDuration="5.553803937s" podCreationTimestamp="2026-03-09 13:36:43 +0000 UTC" firstStartedPulling="2026-03-09 13:36:44.438499607 +0000 UTC m=+1000.405915293" lastFinishedPulling="2026-03-09 13:36:47.888466932 +0000 UTC m=+1003.855882618" observedRunningTime="2026-03-09 13:36:48.510367437 +0000 UTC m=+1004.477783123" watchObservedRunningTime="2026-03-09 13:36:48.553803937 +0000 UTC m=+1004.521219623" Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.572114 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cfxln"] Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.582881 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cfxln"] Mar 09 13:36:48 crc kubenswrapper[4703]: I0309 13:36:48.718162 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96332f87-9756-458e-881a-5e97b24c1591" path="/var/lib/kubelet/pods/96332f87-9756-458e-881a-5e97b24c1591/volumes" Mar 09 13:36:49 crc kubenswrapper[4703]: I0309 13:36:49.450447 4703 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="manila-kuttl-tests/openstack-galera-0" event={"ID":"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312","Type":"ContainerStarted","Data":"316ff9945ba6f26880f3367726c47363142cdc5a61befe1efeaf419a030922e9"} Mar 09 13:36:49 crc kubenswrapper[4703]: I0309 13:36:49.455156 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/openstack-galera-2" event={"ID":"03a80206-7d7f-42bc-b3ce-57abda992a6a","Type":"ContainerStarted","Data":"01a3a0a2b8c2cf662f72798704bb023d87fb809535be722e6abdc379c5c2a58f"} Mar 09 13:36:49 crc kubenswrapper[4703]: I0309 13:36:49.457408 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/openstack-galera-1" event={"ID":"06f52029-696b-414e-a98e-0266b9f71c15","Type":"ContainerStarted","Data":"626d204f5e2913a04b5bf04dd0d9137600bbecd6f636d6b80600daa15dfa3fd3"} Mar 09 13:36:49 crc kubenswrapper[4703]: I0309 13:36:49.482059 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/openstack-galera-0" podStartSLOduration=7.807596424 podStartE2EDuration="16.48204633s" podCreationTimestamp="2026-03-09 13:36:33 +0000 UTC" firstStartedPulling="2026-03-09 13:36:35.486463248 +0000 UTC m=+991.453878934" lastFinishedPulling="2026-03-09 13:36:44.160913154 +0000 UTC m=+1000.128328840" observedRunningTime="2026-03-09 13:36:49.476620643 +0000 UTC m=+1005.444036329" watchObservedRunningTime="2026-03-09 13:36:49.48204633 +0000 UTC m=+1005.449462016" Mar 09 13:36:49 crc kubenswrapper[4703]: I0309 13:36:49.540519 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/openstack-galera-1" podStartSLOduration=7.838945418 podStartE2EDuration="16.540502465s" podCreationTimestamp="2026-03-09 13:36:33 +0000 UTC" firstStartedPulling="2026-03-09 13:36:35.392569057 +0000 UTC m=+991.359984743" lastFinishedPulling="2026-03-09 13:36:44.094126104 +0000 UTC m=+1000.061541790" observedRunningTime="2026-03-09 13:36:49.539092627 +0000 UTC 
m=+1005.506508313" watchObservedRunningTime="2026-03-09 13:36:49.540502465 +0000 UTC m=+1005.507918151" Mar 09 13:36:49 crc kubenswrapper[4703]: I0309 13:36:49.541499 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/openstack-galera-2" podStartSLOduration=7.981227994 podStartE2EDuration="16.541494012s" podCreationTimestamp="2026-03-09 13:36:33 +0000 UTC" firstStartedPulling="2026-03-09 13:36:35.489942362 +0000 UTC m=+991.457358048" lastFinishedPulling="2026-03-09 13:36:44.05020834 +0000 UTC m=+1000.017624066" observedRunningTime="2026-03-09 13:36:49.510958069 +0000 UTC m=+1005.478373755" watchObservedRunningTime="2026-03-09 13:36:49.541494012 +0000 UTC m=+1005.508909688" Mar 09 13:36:51 crc kubenswrapper[4703]: I0309 13:36:51.472038 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-lw4f8" event={"ID":"54568517-643e-4d68-bb0b-daf6562bb23d","Type":"ContainerStarted","Data":"78a29c1e64ac21ec83effb16efbfe855c543d7f39431248b1d815c5f3837af74"} Mar 09 13:36:51 crc kubenswrapper[4703]: I0309 13:36:51.492837 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-lw4f8" podStartSLOduration=1.2711574780000001 podStartE2EDuration="4.492821341s" podCreationTimestamp="2026-03-09 13:36:47 +0000 UTC" firstStartedPulling="2026-03-09 13:36:48.071987479 +0000 UTC m=+1004.039403165" lastFinishedPulling="2026-03-09 13:36:51.293651342 +0000 UTC m=+1007.261067028" observedRunningTime="2026-03-09 13:36:51.489125151 +0000 UTC m=+1007.456540847" watchObservedRunningTime="2026-03-09 13:36:51.492821341 +0000 UTC m=+1007.460237027" Mar 09 13:36:53 crc kubenswrapper[4703]: I0309 13:36:53.700079 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="manila-kuttl-tests/memcached-0" Mar 09 13:36:54 crc kubenswrapper[4703]: I0309 13:36:54.990823 4703 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:54 crc kubenswrapper[4703]: I0309 13:36:54.990892 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:36:55 crc kubenswrapper[4703]: I0309 13:36:55.015106 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:55 crc kubenswrapper[4703]: I0309 13:36:55.015564 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:36:55 crc kubenswrapper[4703]: I0309 13:36:55.018492 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:55 crc kubenswrapper[4703]: I0309 13:36:55.018558 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:57 crc kubenswrapper[4703]: I0309 13:36:57.351867 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:57 crc kubenswrapper[4703]: I0309 13:36:57.460190 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:36:57 crc kubenswrapper[4703]: I0309 13:36:57.504323 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-lw4f8" Mar 09 13:36:57 crc kubenswrapper[4703]: I0309 13:36:57.504376 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-lw4f8" Mar 09 13:36:57 crc kubenswrapper[4703]: I0309 13:36:57.528706 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-lw4f8" Mar 09 13:36:58 crc kubenswrapper[4703]: I0309 13:36:58.546935 
4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-lw4f8" Mar 09 13:37:03 crc kubenswrapper[4703]: I0309 13:37:03.734841 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/root-account-create-update-95mws"] Mar 09 13:37:03 crc kubenswrapper[4703]: E0309 13:37:03.735319 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96332f87-9756-458e-881a-5e97b24c1591" containerName="registry-server" Mar 09 13:37:03 crc kubenswrapper[4703]: I0309 13:37:03.735332 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="96332f87-9756-458e-881a-5e97b24c1591" containerName="registry-server" Mar 09 13:37:03 crc kubenswrapper[4703]: E0309 13:37:03.735343 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96332f87-9756-458e-881a-5e97b24c1591" containerName="extract-utilities" Mar 09 13:37:03 crc kubenswrapper[4703]: I0309 13:37:03.735349 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="96332f87-9756-458e-881a-5e97b24c1591" containerName="extract-utilities" Mar 09 13:37:03 crc kubenswrapper[4703]: E0309 13:37:03.735361 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96332f87-9756-458e-881a-5e97b24c1591" containerName="extract-content" Mar 09 13:37:03 crc kubenswrapper[4703]: I0309 13:37:03.735367 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="96332f87-9756-458e-881a-5e97b24c1591" containerName="extract-content" Mar 09 13:37:03 crc kubenswrapper[4703]: I0309 13:37:03.735477 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="96332f87-9756-458e-881a-5e97b24c1591" containerName="registry-server" Mar 09 13:37:03 crc kubenswrapper[4703]: I0309 13:37:03.735885 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/root-account-create-update-95mws" Mar 09 13:37:03 crc kubenswrapper[4703]: I0309 13:37:03.738122 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"openstack-mariadb-root-db-secret" Mar 09 13:37:03 crc kubenswrapper[4703]: I0309 13:37:03.756289 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/root-account-create-update-95mws"] Mar 09 13:37:03 crc kubenswrapper[4703]: I0309 13:37:03.902111 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqjvc\" (UniqueName: \"kubernetes.io/projected/c10cc726-da66-44d2-9dc8-7af3f4afce0e-kube-api-access-wqjvc\") pod \"root-account-create-update-95mws\" (UID: \"c10cc726-da66-44d2-9dc8-7af3f4afce0e\") " pod="manila-kuttl-tests/root-account-create-update-95mws" Mar 09 13:37:03 crc kubenswrapper[4703]: I0309 13:37:03.902170 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c10cc726-da66-44d2-9dc8-7af3f4afce0e-operator-scripts\") pod \"root-account-create-update-95mws\" (UID: \"c10cc726-da66-44d2-9dc8-7af3f4afce0e\") " pod="manila-kuttl-tests/root-account-create-update-95mws" Mar 09 13:37:04 crc kubenswrapper[4703]: I0309 13:37:04.003984 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqjvc\" (UniqueName: \"kubernetes.io/projected/c10cc726-da66-44d2-9dc8-7af3f4afce0e-kube-api-access-wqjvc\") pod \"root-account-create-update-95mws\" (UID: \"c10cc726-da66-44d2-9dc8-7af3f4afce0e\") " pod="manila-kuttl-tests/root-account-create-update-95mws" Mar 09 13:37:04 crc kubenswrapper[4703]: I0309 13:37:04.004048 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c10cc726-da66-44d2-9dc8-7af3f4afce0e-operator-scripts\") pod 
\"root-account-create-update-95mws\" (UID: \"c10cc726-da66-44d2-9dc8-7af3f4afce0e\") " pod="manila-kuttl-tests/root-account-create-update-95mws" Mar 09 13:37:04 crc kubenswrapper[4703]: I0309 13:37:04.005059 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c10cc726-da66-44d2-9dc8-7af3f4afce0e-operator-scripts\") pod \"root-account-create-update-95mws\" (UID: \"c10cc726-da66-44d2-9dc8-7af3f4afce0e\") " pod="manila-kuttl-tests/root-account-create-update-95mws" Mar 09 13:37:04 crc kubenswrapper[4703]: I0309 13:37:04.027282 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqjvc\" (UniqueName: \"kubernetes.io/projected/c10cc726-da66-44d2-9dc8-7af3f4afce0e-kube-api-access-wqjvc\") pod \"root-account-create-update-95mws\" (UID: \"c10cc726-da66-44d2-9dc8-7af3f4afce0e\") " pod="manila-kuttl-tests/root-account-create-update-95mws" Mar 09 13:37:04 crc kubenswrapper[4703]: I0309 13:37:04.052179 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/root-account-create-update-95mws" Mar 09 13:37:05 crc kubenswrapper[4703]: I0309 13:37:05.094911 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="manila-kuttl-tests/openstack-galera-2" podUID="03a80206-7d7f-42bc-b3ce-57abda992a6a" containerName="galera" probeResult="failure" output=< Mar 09 13:37:05 crc kubenswrapper[4703]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Mar 09 13:37:05 crc kubenswrapper[4703]: > Mar 09 13:37:05 crc kubenswrapper[4703]: I0309 13:37:05.839069 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/root-account-create-update-95mws"] Mar 09 13:37:05 crc kubenswrapper[4703]: I0309 13:37:05.853864 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"openstack-mariadb-root-db-secret" Mar 09 13:37:05 crc kubenswrapper[4703]: I0309 13:37:05.858793 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk"] Mar 09 13:37:05 crc kubenswrapper[4703]: I0309 13:37:05.860144 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk" Mar 09 13:37:05 crc kubenswrapper[4703]: I0309 13:37:05.870323 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-cxl8l" Mar 09 13:37:05 crc kubenswrapper[4703]: I0309 13:37:05.875290 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk"] Mar 09 13:37:05 crc kubenswrapper[4703]: I0309 13:37:05.928912 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbwt6\" (UniqueName: \"kubernetes.io/projected/eb5a75be-0890-4f1c-988b-ac1f0d2399b3-kube-api-access-rbwt6\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk\" (UID: \"eb5a75be-0890-4f1c-988b-ac1f0d2399b3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk" Mar 09 13:37:05 crc kubenswrapper[4703]: I0309 13:37:05.928978 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb5a75be-0890-4f1c-988b-ac1f0d2399b3-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk\" (UID: \"eb5a75be-0890-4f1c-988b-ac1f0d2399b3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk" Mar 09 13:37:05 crc kubenswrapper[4703]: I0309 13:37:05.929003 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb5a75be-0890-4f1c-988b-ac1f0d2399b3-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk\" (UID: \"eb5a75be-0890-4f1c-988b-ac1f0d2399b3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk" Mar 09 13:37:06 crc kubenswrapper[4703]: I0309 
13:37:06.030384 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb5a75be-0890-4f1c-988b-ac1f0d2399b3-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk\" (UID: \"eb5a75be-0890-4f1c-988b-ac1f0d2399b3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk" Mar 09 13:37:06 crc kubenswrapper[4703]: I0309 13:37:06.031175 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb5a75be-0890-4f1c-988b-ac1f0d2399b3-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk\" (UID: \"eb5a75be-0890-4f1c-988b-ac1f0d2399b3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk" Mar 09 13:37:06 crc kubenswrapper[4703]: I0309 13:37:06.031338 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbwt6\" (UniqueName: \"kubernetes.io/projected/eb5a75be-0890-4f1c-988b-ac1f0d2399b3-kube-api-access-rbwt6\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk\" (UID: \"eb5a75be-0890-4f1c-988b-ac1f0d2399b3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk" Mar 09 13:37:06 crc kubenswrapper[4703]: I0309 13:37:06.031643 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb5a75be-0890-4f1c-988b-ac1f0d2399b3-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk\" (UID: \"eb5a75be-0890-4f1c-988b-ac1f0d2399b3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk" Mar 09 13:37:06 crc kubenswrapper[4703]: I0309 13:37:06.032029 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/eb5a75be-0890-4f1c-988b-ac1f0d2399b3-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk\" (UID: \"eb5a75be-0890-4f1c-988b-ac1f0d2399b3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk" Mar 09 13:37:06 crc kubenswrapper[4703]: I0309 13:37:06.057780 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbwt6\" (UniqueName: \"kubernetes.io/projected/eb5a75be-0890-4f1c-988b-ac1f0d2399b3-kube-api-access-rbwt6\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk\" (UID: \"eb5a75be-0890-4f1c-988b-ac1f0d2399b3\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk" Mar 09 13:37:06 crc kubenswrapper[4703]: I0309 13:37:06.199276 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk" Mar 09 13:37:06 crc kubenswrapper[4703]: I0309 13:37:06.574103 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/root-account-create-update-95mws" event={"ID":"c10cc726-da66-44d2-9dc8-7af3f4afce0e","Type":"ContainerStarted","Data":"8fab6a2da38a694c4e883d9ae79143e9450498160afbad5228424f3c025fbd8c"} Mar 09 13:37:06 crc kubenswrapper[4703]: I0309 13:37:06.574434 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/root-account-create-update-95mws" event={"ID":"c10cc726-da66-44d2-9dc8-7af3f4afce0e","Type":"ContainerStarted","Data":"fd1ef5829bc1769168e49dfa2cdec746bef25a8dd6c26cef1c3da651bfd54b90"} Mar 09 13:37:06 crc kubenswrapper[4703]: I0309 13:37:06.592936 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/root-account-create-update-95mws" podStartSLOduration=3.592919767 podStartE2EDuration="3.592919767s" podCreationTimestamp="2026-03-09 13:37:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:37:06.590406259 +0000 UTC m=+1022.557821945" watchObservedRunningTime="2026-03-09 13:37:06.592919767 +0000 UTC m=+1022.560335453" Mar 09 13:37:06 crc kubenswrapper[4703]: I0309 13:37:06.666485 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk"] Mar 09 13:37:06 crc kubenswrapper[4703]: W0309 13:37:06.669829 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb5a75be_0890_4f1c_988b_ac1f0d2399b3.slice/crio-acbba2ebaf02f47fc2bbbcc547d4df9fe75559da5f36334618ca421f0432d97f WatchSource:0}: Error finding container acbba2ebaf02f47fc2bbbcc547d4df9fe75559da5f36334618ca421f0432d97f: Status 404 returned error can't find the container with id acbba2ebaf02f47fc2bbbcc547d4df9fe75559da5f36334618ca421f0432d97f Mar 09 13:37:07 crc kubenswrapper[4703]: I0309 13:37:07.583305 4703 generic.go:334] "Generic (PLEG): container finished" podID="eb5a75be-0890-4f1c-988b-ac1f0d2399b3" containerID="85faee2f4db6f602d51c14b0f8c52833092bd9ea8e7598c2f93a7c4e35b72d11" exitCode=0 Mar 09 13:37:07 crc kubenswrapper[4703]: I0309 13:37:07.583356 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk" event={"ID":"eb5a75be-0890-4f1c-988b-ac1f0d2399b3","Type":"ContainerDied","Data":"85faee2f4db6f602d51c14b0f8c52833092bd9ea8e7598c2f93a7c4e35b72d11"} Mar 09 13:37:07 crc kubenswrapper[4703]: I0309 13:37:07.583717 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk" event={"ID":"eb5a75be-0890-4f1c-988b-ac1f0d2399b3","Type":"ContainerStarted","Data":"acbba2ebaf02f47fc2bbbcc547d4df9fe75559da5f36334618ca421f0432d97f"} Mar 09 13:37:07 crc kubenswrapper[4703]: I0309 
13:37:07.586406 4703 generic.go:334] "Generic (PLEG): container finished" podID="c10cc726-da66-44d2-9dc8-7af3f4afce0e" containerID="8fab6a2da38a694c4e883d9ae79143e9450498160afbad5228424f3c025fbd8c" exitCode=0 Mar 09 13:37:07 crc kubenswrapper[4703]: I0309 13:37:07.586447 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/root-account-create-update-95mws" event={"ID":"c10cc726-da66-44d2-9dc8-7af3f4afce0e","Type":"ContainerDied","Data":"8fab6a2da38a694c4e883d9ae79143e9450498160afbad5228424f3c025fbd8c"} Mar 09 13:37:08 crc kubenswrapper[4703]: I0309 13:37:08.596839 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk" event={"ID":"eb5a75be-0890-4f1c-988b-ac1f0d2399b3","Type":"ContainerStarted","Data":"25a3def67eb3dcf50b633a42ddc5bb3f82877c01845ab257ad0e0ef4d1188df7"} Mar 09 13:37:08 crc kubenswrapper[4703]: I0309 13:37:08.951321 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/root-account-create-update-95mws" Mar 09 13:37:09 crc kubenswrapper[4703]: I0309 13:37:09.080115 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqjvc\" (UniqueName: \"kubernetes.io/projected/c10cc726-da66-44d2-9dc8-7af3f4afce0e-kube-api-access-wqjvc\") pod \"c10cc726-da66-44d2-9dc8-7af3f4afce0e\" (UID: \"c10cc726-da66-44d2-9dc8-7af3f4afce0e\") " Mar 09 13:37:09 crc kubenswrapper[4703]: I0309 13:37:09.080185 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c10cc726-da66-44d2-9dc8-7af3f4afce0e-operator-scripts\") pod \"c10cc726-da66-44d2-9dc8-7af3f4afce0e\" (UID: \"c10cc726-da66-44d2-9dc8-7af3f4afce0e\") " Mar 09 13:37:09 crc kubenswrapper[4703]: I0309 13:37:09.081125 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c10cc726-da66-44d2-9dc8-7af3f4afce0e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c10cc726-da66-44d2-9dc8-7af3f4afce0e" (UID: "c10cc726-da66-44d2-9dc8-7af3f4afce0e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:37:09 crc kubenswrapper[4703]: I0309 13:37:09.088085 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c10cc726-da66-44d2-9dc8-7af3f4afce0e-kube-api-access-wqjvc" (OuterVolumeSpecName: "kube-api-access-wqjvc") pod "c10cc726-da66-44d2-9dc8-7af3f4afce0e" (UID: "c10cc726-da66-44d2-9dc8-7af3f4afce0e"). InnerVolumeSpecName "kube-api-access-wqjvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:37:09 crc kubenswrapper[4703]: I0309 13:37:09.181692 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqjvc\" (UniqueName: \"kubernetes.io/projected/c10cc726-da66-44d2-9dc8-7af3f4afce0e-kube-api-access-wqjvc\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:09 crc kubenswrapper[4703]: I0309 13:37:09.181727 4703 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c10cc726-da66-44d2-9dc8-7af3f4afce0e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:09 crc kubenswrapper[4703]: I0309 13:37:09.499917 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:37:09 crc kubenswrapper[4703]: I0309 13:37:09.499967 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:37:09 crc kubenswrapper[4703]: I0309 13:37:09.500019 4703 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" Mar 09 13:37:09 crc kubenswrapper[4703]: I0309 13:37:09.500499 4703 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5f24d1f6f12dd9f2dbe4d447d5f4ec4a161023da53dbcd799df4861c45de4b0c"} pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" 
Mar 09 13:37:09 crc kubenswrapper[4703]: I0309 13:37:09.500552 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" containerID="cri-o://5f24d1f6f12dd9f2dbe4d447d5f4ec4a161023da53dbcd799df4861c45de4b0c" gracePeriod=600 Mar 09 13:37:09 crc kubenswrapper[4703]: I0309 13:37:09.603584 4703 generic.go:334] "Generic (PLEG): container finished" podID="eb5a75be-0890-4f1c-988b-ac1f0d2399b3" containerID="25a3def67eb3dcf50b633a42ddc5bb3f82877c01845ab257ad0e0ef4d1188df7" exitCode=0 Mar 09 13:37:09 crc kubenswrapper[4703]: I0309 13:37:09.603648 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk" event={"ID":"eb5a75be-0890-4f1c-988b-ac1f0d2399b3","Type":"ContainerDied","Data":"25a3def67eb3dcf50b633a42ddc5bb3f82877c01845ab257ad0e0ef4d1188df7"} Mar 09 13:37:09 crc kubenswrapper[4703]: I0309 13:37:09.606547 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/root-account-create-update-95mws" event={"ID":"c10cc726-da66-44d2-9dc8-7af3f4afce0e","Type":"ContainerDied","Data":"fd1ef5829bc1769168e49dfa2cdec746bef25a8dd6c26cef1c3da651bfd54b90"} Mar 09 13:37:09 crc kubenswrapper[4703]: I0309 13:37:09.606828 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd1ef5829bc1769168e49dfa2cdec746bef25a8dd6c26cef1c3da651bfd54b90" Mar 09 13:37:09 crc kubenswrapper[4703]: I0309 13:37:09.606944 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/root-account-create-update-95mws" Mar 09 13:37:10 crc kubenswrapper[4703]: I0309 13:37:10.614889 4703 generic.go:334] "Generic (PLEG): container finished" podID="eb5a75be-0890-4f1c-988b-ac1f0d2399b3" containerID="accc86158e1720e2f0e11414c1e0992a7c4e7b8a90e35f318da65c64cb8ae708" exitCode=0 Mar 09 13:37:10 crc kubenswrapper[4703]: I0309 13:37:10.614966 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk" event={"ID":"eb5a75be-0890-4f1c-988b-ac1f0d2399b3","Type":"ContainerDied","Data":"accc86158e1720e2f0e11414c1e0992a7c4e7b8a90e35f318da65c64cb8ae708"} Mar 09 13:37:10 crc kubenswrapper[4703]: I0309 13:37:10.617479 4703 generic.go:334] "Generic (PLEG): container finished" podID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerID="5f24d1f6f12dd9f2dbe4d447d5f4ec4a161023da53dbcd799df4861c45de4b0c" exitCode=0 Mar 09 13:37:10 crc kubenswrapper[4703]: I0309 13:37:10.617515 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" event={"ID":"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29","Type":"ContainerDied","Data":"5f24d1f6f12dd9f2dbe4d447d5f4ec4a161023da53dbcd799df4861c45de4b0c"} Mar 09 13:37:10 crc kubenswrapper[4703]: I0309 13:37:10.617540 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" event={"ID":"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29","Type":"ContainerStarted","Data":"10cd3ad358458f317f93eedfe061a2094205d9a36cfaa6732883cb518fdce4bc"} Mar 09 13:37:10 crc kubenswrapper[4703]: I0309 13:37:10.617557 4703 scope.go:117] "RemoveContainer" containerID="9a30d8d9d6908c22df3252ca1b072edab287dd75aa2220dbdad578c0ef22ecae" Mar 09 13:37:11 crc kubenswrapper[4703]: I0309 13:37:11.097769 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="manila-kuttl-tests/openstack-galera-0" Mar 09 
13:37:11 crc kubenswrapper[4703]: I0309 13:37:11.215667 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:37:11 crc kubenswrapper[4703]: I0309 13:37:11.959588 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk" Mar 09 13:37:12 crc kubenswrapper[4703]: I0309 13:37:12.035967 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbwt6\" (UniqueName: \"kubernetes.io/projected/eb5a75be-0890-4f1c-988b-ac1f0d2399b3-kube-api-access-rbwt6\") pod \"eb5a75be-0890-4f1c-988b-ac1f0d2399b3\" (UID: \"eb5a75be-0890-4f1c-988b-ac1f0d2399b3\") " Mar 09 13:37:12 crc kubenswrapper[4703]: I0309 13:37:12.036108 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb5a75be-0890-4f1c-988b-ac1f0d2399b3-util\") pod \"eb5a75be-0890-4f1c-988b-ac1f0d2399b3\" (UID: \"eb5a75be-0890-4f1c-988b-ac1f0d2399b3\") " Mar 09 13:37:12 crc kubenswrapper[4703]: I0309 13:37:12.036215 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb5a75be-0890-4f1c-988b-ac1f0d2399b3-bundle\") pod \"eb5a75be-0890-4f1c-988b-ac1f0d2399b3\" (UID: \"eb5a75be-0890-4f1c-988b-ac1f0d2399b3\") " Mar 09 13:37:12 crc kubenswrapper[4703]: I0309 13:37:12.037634 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb5a75be-0890-4f1c-988b-ac1f0d2399b3-bundle" (OuterVolumeSpecName: "bundle") pod "eb5a75be-0890-4f1c-988b-ac1f0d2399b3" (UID: "eb5a75be-0890-4f1c-988b-ac1f0d2399b3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:37:12 crc kubenswrapper[4703]: I0309 13:37:12.042484 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb5a75be-0890-4f1c-988b-ac1f0d2399b3-kube-api-access-rbwt6" (OuterVolumeSpecName: "kube-api-access-rbwt6") pod "eb5a75be-0890-4f1c-988b-ac1f0d2399b3" (UID: "eb5a75be-0890-4f1c-988b-ac1f0d2399b3"). InnerVolumeSpecName "kube-api-access-rbwt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:37:12 crc kubenswrapper[4703]: I0309 13:37:12.052743 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb5a75be-0890-4f1c-988b-ac1f0d2399b3-util" (OuterVolumeSpecName: "util") pod "eb5a75be-0890-4f1c-988b-ac1f0d2399b3" (UID: "eb5a75be-0890-4f1c-988b-ac1f0d2399b3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:37:12 crc kubenswrapper[4703]: I0309 13:37:12.137710 4703 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb5a75be-0890-4f1c-988b-ac1f0d2399b3-util\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:12 crc kubenswrapper[4703]: I0309 13:37:12.137742 4703 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb5a75be-0890-4f1c-988b-ac1f0d2399b3-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:12 crc kubenswrapper[4703]: I0309 13:37:12.137753 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbwt6\" (UniqueName: \"kubernetes.io/projected/eb5a75be-0890-4f1c-988b-ac1f0d2399b3-kube-api-access-rbwt6\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:12 crc kubenswrapper[4703]: I0309 13:37:12.278502 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:37:12 crc kubenswrapper[4703]: I0309 13:37:12.405400 4703 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:37:12 crc kubenswrapper[4703]: I0309 13:37:12.635448 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk" event={"ID":"eb5a75be-0890-4f1c-988b-ac1f0d2399b3","Type":"ContainerDied","Data":"acbba2ebaf02f47fc2bbbcc547d4df9fe75559da5f36334618ca421f0432d97f"} Mar 09 13:37:12 crc kubenswrapper[4703]: I0309 13:37:12.635475 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk" Mar 09 13:37:12 crc kubenswrapper[4703]: I0309 13:37:12.635499 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acbba2ebaf02f47fc2bbbcc547d4df9fe75559da5f36334618ca421f0432d97f" Mar 09 13:37:22 crc kubenswrapper[4703]: I0309 13:37:22.722932 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-wv5q9"] Mar 09 13:37:22 crc kubenswrapper[4703]: E0309 13:37:22.724300 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb5a75be-0890-4f1c-988b-ac1f0d2399b3" containerName="util" Mar 09 13:37:22 crc kubenswrapper[4703]: I0309 13:37:22.724334 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb5a75be-0890-4f1c-988b-ac1f0d2399b3" containerName="util" Mar 09 13:37:22 crc kubenswrapper[4703]: E0309 13:37:22.724357 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c10cc726-da66-44d2-9dc8-7af3f4afce0e" containerName="mariadb-account-create-update" Mar 09 13:37:22 crc kubenswrapper[4703]: I0309 13:37:22.724374 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="c10cc726-da66-44d2-9dc8-7af3f4afce0e" containerName="mariadb-account-create-update" Mar 09 13:37:22 crc kubenswrapper[4703]: E0309 13:37:22.724397 4703 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eb5a75be-0890-4f1c-988b-ac1f0d2399b3" containerName="extract" Mar 09 13:37:22 crc kubenswrapper[4703]: I0309 13:37:22.724413 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb5a75be-0890-4f1c-988b-ac1f0d2399b3" containerName="extract" Mar 09 13:37:22 crc kubenswrapper[4703]: E0309 13:37:22.724451 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb5a75be-0890-4f1c-988b-ac1f0d2399b3" containerName="pull" Mar 09 13:37:22 crc kubenswrapper[4703]: I0309 13:37:22.724468 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb5a75be-0890-4f1c-988b-ac1f0d2399b3" containerName="pull" Mar 09 13:37:22 crc kubenswrapper[4703]: I0309 13:37:22.724769 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="c10cc726-da66-44d2-9dc8-7af3f4afce0e" containerName="mariadb-account-create-update" Mar 09 13:37:22 crc kubenswrapper[4703]: I0309 13:37:22.724820 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb5a75be-0890-4f1c-988b-ac1f0d2399b3" containerName="extract" Mar 09 13:37:22 crc kubenswrapper[4703]: I0309 13:37:22.725595 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-wv5q9" Mar 09 13:37:22 crc kubenswrapper[4703]: I0309 13:37:22.728882 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-wz7kq" Mar 09 13:37:22 crc kubenswrapper[4703]: I0309 13:37:22.745690 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-wv5q9"] Mar 09 13:37:22 crc kubenswrapper[4703]: I0309 13:37:22.895220 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pltbj\" (UniqueName: \"kubernetes.io/projected/f7c690e1-16e7-4418-b82c-a846f6de3430-kube-api-access-pltbj\") pod \"rabbitmq-cluster-operator-779fc9694b-wv5q9\" (UID: \"f7c690e1-16e7-4418-b82c-a846f6de3430\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-wv5q9" Mar 09 13:37:22 crc kubenswrapper[4703]: I0309 13:37:22.996775 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pltbj\" (UniqueName: \"kubernetes.io/projected/f7c690e1-16e7-4418-b82c-a846f6de3430-kube-api-access-pltbj\") pod \"rabbitmq-cluster-operator-779fc9694b-wv5q9\" (UID: \"f7c690e1-16e7-4418-b82c-a846f6de3430\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-wv5q9" Mar 09 13:37:23 crc kubenswrapper[4703]: I0309 13:37:23.014131 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pltbj\" (UniqueName: \"kubernetes.io/projected/f7c690e1-16e7-4418-b82c-a846f6de3430-kube-api-access-pltbj\") pod \"rabbitmq-cluster-operator-779fc9694b-wv5q9\" (UID: \"f7c690e1-16e7-4418-b82c-a846f6de3430\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-wv5q9" Mar 09 13:37:23 crc kubenswrapper[4703]: I0309 13:37:23.053131 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-wv5q9" Mar 09 13:37:23 crc kubenswrapper[4703]: I0309 13:37:23.559290 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-wv5q9"] Mar 09 13:37:23 crc kubenswrapper[4703]: I0309 13:37:23.711075 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-wv5q9" event={"ID":"f7c690e1-16e7-4418-b82c-a846f6de3430","Type":"ContainerStarted","Data":"a736d62c855a173ab0b0b51082f8a8f4b0812170b8a7acfdb7c390d36e998615"} Mar 09 13:37:26 crc kubenswrapper[4703]: I0309 13:37:26.737085 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-wv5q9" event={"ID":"f7c690e1-16e7-4418-b82c-a846f6de3430","Type":"ContainerStarted","Data":"2c4b7c7293332bae03aadf0e8b1aa3b65ac74e5cee2c3ef14a86cec077857ca2"} Mar 09 13:37:26 crc kubenswrapper[4703]: I0309 13:37:26.763841 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-wv5q9" podStartSLOduration=1.7478889579999999 podStartE2EDuration="4.763812166s" podCreationTimestamp="2026-03-09 13:37:22 +0000 UTC" firstStartedPulling="2026-03-09 13:37:23.568019798 +0000 UTC m=+1039.535435484" lastFinishedPulling="2026-03-09 13:37:26.583942996 +0000 UTC m=+1042.551358692" observedRunningTime="2026-03-09 13:37:26.753593598 +0000 UTC m=+1042.721009284" watchObservedRunningTime="2026-03-09 13:37:26.763812166 +0000 UTC m=+1042.731227892" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.612359 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/rabbitmq-server-0"] Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.614760 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.617114 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"manila-kuttl-tests"/"rabbitmq-server-conf" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.617259 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"rabbitmq-default-user" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.617116 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"rabbitmq-server-dockercfg-f6d6h" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.617467 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"rabbitmq-erlang-cookie" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.617536 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"manila-kuttl-tests"/"rabbitmq-plugins-conf" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.624116 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/rabbitmq-server-0"] Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.714302 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/379d4845-913a-4422-915c-221497738cde-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.714642 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2bee0633-6abb-4275-a483-296d2526d79c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2bee0633-6abb-4275-a483-296d2526d79c\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 
13:37:30.714670 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/379d4845-913a-4422-915c-221497738cde-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.714705 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/379d4845-913a-4422-915c-221497738cde-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.714728 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/379d4845-913a-4422-915c-221497738cde-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.714775 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/379d4845-913a-4422-915c-221497738cde-pod-info\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.714803 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/379d4845-913a-4422-915c-221497738cde-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 
13:37:30.714821 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx6n2\" (UniqueName: \"kubernetes.io/projected/379d4845-913a-4422-915c-221497738cde-kube-api-access-nx6n2\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.815747 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/379d4845-913a-4422-915c-221497738cde-pod-info\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.815858 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/379d4845-913a-4422-915c-221497738cde-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.815891 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx6n2\" (UniqueName: \"kubernetes.io/projected/379d4845-913a-4422-915c-221497738cde-kube-api-access-nx6n2\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.815991 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/379d4845-913a-4422-915c-221497738cde-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.816022 4703 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-2bee0633-6abb-4275-a483-296d2526d79c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2bee0633-6abb-4275-a483-296d2526d79c\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.816052 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/379d4845-913a-4422-915c-221497738cde-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.816084 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/379d4845-913a-4422-915c-221497738cde-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.816114 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/379d4845-913a-4422-915c-221497738cde-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.818564 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/379d4845-913a-4422-915c-221497738cde-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.820080 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/379d4845-913a-4422-915c-221497738cde-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.820290 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/379d4845-913a-4422-915c-221497738cde-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.822083 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/379d4845-913a-4422-915c-221497738cde-pod-info\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.824265 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/379d4845-913a-4422-915c-221497738cde-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.825443 4703 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.825474 4703 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2bee0633-6abb-4275-a483-296d2526d79c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2bee0633-6abb-4275-a483-296d2526d79c\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4e9b8721d890571fe7ff85c8343ceec75ea160fba723dacece784d9d6480cc89/globalmount\"" pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.839161 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/379d4845-913a-4422-915c-221497738cde-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.846941 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx6n2\" (UniqueName: \"kubernetes.io/projected/379d4845-913a-4422-915c-221497738cde-kube-api-access-nx6n2\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.851432 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2bee0633-6abb-4275-a483-296d2526d79c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2bee0633-6abb-4275-a483-296d2526d79c\") pod \"rabbitmq-server-0\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:30 crc kubenswrapper[4703]: I0309 13:37:30.937831 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:37:31 crc kubenswrapper[4703]: I0309 13:37:31.364987 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/rabbitmq-server-0"] Mar 09 13:37:31 crc kubenswrapper[4703]: W0309 13:37:31.376069 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod379d4845_913a_4422_915c_221497738cde.slice/crio-0eeef2538e59eb0f5e0ef2d63f194b251cd0056f86012bb0b11d28704cf3d733 WatchSource:0}: Error finding container 0eeef2538e59eb0f5e0ef2d63f194b251cd0056f86012bb0b11d28704cf3d733: Status 404 returned error can't find the container with id 0eeef2538e59eb0f5e0ef2d63f194b251cd0056f86012bb0b11d28704cf3d733 Mar 09 13:37:31 crc kubenswrapper[4703]: I0309 13:37:31.775750 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/rabbitmq-server-0" event={"ID":"379d4845-913a-4422-915c-221497738cde","Type":"ContainerStarted","Data":"0eeef2538e59eb0f5e0ef2d63f194b251cd0056f86012bb0b11d28704cf3d733"} Mar 09 13:37:32 crc kubenswrapper[4703]: I0309 13:37:32.191014 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-jmzc6"] Mar 09 13:37:32 crc kubenswrapper[4703]: I0309 13:37:32.191704 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-jmzc6" Mar 09 13:37:32 crc kubenswrapper[4703]: I0309 13:37:32.196399 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-2wbtd" Mar 09 13:37:32 crc kubenswrapper[4703]: I0309 13:37:32.209964 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-jmzc6"] Mar 09 13:37:32 crc kubenswrapper[4703]: I0309 13:37:32.338126 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpghl\" (UniqueName: \"kubernetes.io/projected/895726c9-a46b-4503-b195-ef668833c34f-kube-api-access-kpghl\") pod \"keystone-operator-index-jmzc6\" (UID: \"895726c9-a46b-4503-b195-ef668833c34f\") " pod="openstack-operators/keystone-operator-index-jmzc6" Mar 09 13:37:32 crc kubenswrapper[4703]: I0309 13:37:32.439572 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpghl\" (UniqueName: \"kubernetes.io/projected/895726c9-a46b-4503-b195-ef668833c34f-kube-api-access-kpghl\") pod \"keystone-operator-index-jmzc6\" (UID: \"895726c9-a46b-4503-b195-ef668833c34f\") " pod="openstack-operators/keystone-operator-index-jmzc6" Mar 09 13:37:32 crc kubenswrapper[4703]: I0309 13:37:32.459289 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpghl\" (UniqueName: \"kubernetes.io/projected/895726c9-a46b-4503-b195-ef668833c34f-kube-api-access-kpghl\") pod \"keystone-operator-index-jmzc6\" (UID: \"895726c9-a46b-4503-b195-ef668833c34f\") " pod="openstack-operators/keystone-operator-index-jmzc6" Mar 09 13:37:32 crc kubenswrapper[4703]: I0309 13:37:32.517733 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-jmzc6" Mar 09 13:37:32 crc kubenswrapper[4703]: I0309 13:37:32.930955 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-jmzc6"] Mar 09 13:37:32 crc kubenswrapper[4703]: W0309 13:37:32.935003 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod895726c9_a46b_4503_b195_ef668833c34f.slice/crio-c0f3371a30fb2f001ec7194288b1b39d706774be10878b430bc24110c0034373 WatchSource:0}: Error finding container c0f3371a30fb2f001ec7194288b1b39d706774be10878b430bc24110c0034373: Status 404 returned error can't find the container with id c0f3371a30fb2f001ec7194288b1b39d706774be10878b430bc24110c0034373 Mar 09 13:37:33 crc kubenswrapper[4703]: I0309 13:37:33.833543 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-jmzc6" event={"ID":"895726c9-a46b-4503-b195-ef668833c34f","Type":"ContainerStarted","Data":"c0f3371a30fb2f001ec7194288b1b39d706774be10878b430bc24110c0034373"} Mar 09 13:37:37 crc kubenswrapper[4703]: I0309 13:37:37.864737 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-jmzc6" event={"ID":"895726c9-a46b-4503-b195-ef668833c34f","Type":"ContainerStarted","Data":"c4a80f077dc8f6b6f4948a3dbb1162abecd394310108e7a3c7bd239aca6bd5ab"} Mar 09 13:37:37 crc kubenswrapper[4703]: I0309 13:37:37.906190 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-jmzc6" podStartSLOduration=1.604817614 podStartE2EDuration="5.906157706s" podCreationTimestamp="2026-03-09 13:37:32 +0000 UTC" firstStartedPulling="2026-03-09 13:37:32.936739253 +0000 UTC m=+1048.904154939" lastFinishedPulling="2026-03-09 13:37:37.238079305 +0000 UTC m=+1053.205495031" observedRunningTime="2026-03-09 13:37:37.896745831 +0000 UTC m=+1053.864161557" 
watchObservedRunningTime="2026-03-09 13:37:37.906157706 +0000 UTC m=+1053.873573432" Mar 09 13:37:38 crc kubenswrapper[4703]: I0309 13:37:38.875900 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/rabbitmq-server-0" event={"ID":"379d4845-913a-4422-915c-221497738cde","Type":"ContainerStarted","Data":"b80aeb6f8da13444f0021e8f79e848e61d3e5777cdafbc5520d0d0cb340226fa"} Mar 09 13:37:42 crc kubenswrapper[4703]: I0309 13:37:42.518253 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-jmzc6" Mar 09 13:37:42 crc kubenswrapper[4703]: I0309 13:37:42.518625 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-jmzc6" Mar 09 13:37:42 crc kubenswrapper[4703]: I0309 13:37:42.551743 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-jmzc6" Mar 09 13:37:42 crc kubenswrapper[4703]: I0309 13:37:42.938922 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-jmzc6" Mar 09 13:37:50 crc kubenswrapper[4703]: I0309 13:37:50.663743 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg"] Mar 09 13:37:50 crc kubenswrapper[4703]: I0309 13:37:50.665880 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg" Mar 09 13:37:50 crc kubenswrapper[4703]: I0309 13:37:50.668205 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-cxl8l" Mar 09 13:37:50 crc kubenswrapper[4703]: I0309 13:37:50.678487 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg"] Mar 09 13:37:50 crc kubenswrapper[4703]: I0309 13:37:50.780319 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27161253-4d46-49d6-a665-87451a02c056-bundle\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg\" (UID: \"27161253-4d46-49d6-a665-87451a02c056\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg" Mar 09 13:37:50 crc kubenswrapper[4703]: I0309 13:37:50.780378 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6qjw\" (UniqueName: \"kubernetes.io/projected/27161253-4d46-49d6-a665-87451a02c056-kube-api-access-z6qjw\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg\" (UID: \"27161253-4d46-49d6-a665-87451a02c056\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg" Mar 09 13:37:50 crc kubenswrapper[4703]: I0309 13:37:50.780716 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27161253-4d46-49d6-a665-87451a02c056-util\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg\" (UID: \"27161253-4d46-49d6-a665-87451a02c056\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg" Mar 09 13:37:50 crc kubenswrapper[4703]: I0309 
13:37:50.882272 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27161253-4d46-49d6-a665-87451a02c056-util\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg\" (UID: \"27161253-4d46-49d6-a665-87451a02c056\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg" Mar 09 13:37:50 crc kubenswrapper[4703]: I0309 13:37:50.882575 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27161253-4d46-49d6-a665-87451a02c056-bundle\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg\" (UID: \"27161253-4d46-49d6-a665-87451a02c056\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg" Mar 09 13:37:50 crc kubenswrapper[4703]: I0309 13:37:50.882715 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6qjw\" (UniqueName: \"kubernetes.io/projected/27161253-4d46-49d6-a665-87451a02c056-kube-api-access-z6qjw\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg\" (UID: \"27161253-4d46-49d6-a665-87451a02c056\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg" Mar 09 13:37:50 crc kubenswrapper[4703]: I0309 13:37:50.884000 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27161253-4d46-49d6-a665-87451a02c056-util\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg\" (UID: \"27161253-4d46-49d6-a665-87451a02c056\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg" Mar 09 13:37:50 crc kubenswrapper[4703]: I0309 13:37:50.884309 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/27161253-4d46-49d6-a665-87451a02c056-bundle\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg\" (UID: \"27161253-4d46-49d6-a665-87451a02c056\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg" Mar 09 13:37:50 crc kubenswrapper[4703]: I0309 13:37:50.903631 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6qjw\" (UniqueName: \"kubernetes.io/projected/27161253-4d46-49d6-a665-87451a02c056-kube-api-access-z6qjw\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg\" (UID: \"27161253-4d46-49d6-a665-87451a02c056\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg" Mar 09 13:37:50 crc kubenswrapper[4703]: I0309 13:37:50.988302 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg" Mar 09 13:37:51 crc kubenswrapper[4703]: I0309 13:37:51.459884 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg"] Mar 09 13:37:51 crc kubenswrapper[4703]: I0309 13:37:51.981273 4703 generic.go:334] "Generic (PLEG): container finished" podID="27161253-4d46-49d6-a665-87451a02c056" containerID="4b96991b9e8c867a4213db405e1e2d71eb4a3587e1b31a33fbe86bd6fd21595d" exitCode=0 Mar 09 13:37:51 crc kubenswrapper[4703]: I0309 13:37:51.981330 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg" event={"ID":"27161253-4d46-49d6-a665-87451a02c056","Type":"ContainerDied","Data":"4b96991b9e8c867a4213db405e1e2d71eb4a3587e1b31a33fbe86bd6fd21595d"} Mar 09 13:37:51 crc kubenswrapper[4703]: I0309 13:37:51.981592 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg" event={"ID":"27161253-4d46-49d6-a665-87451a02c056","Type":"ContainerStarted","Data":"366fc1bbbe0a8d718964cbcd0527551a3abca7a0c9fdd242c6846db063b223e6"} Mar 09 13:37:52 crc kubenswrapper[4703]: I0309 13:37:52.993254 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg" event={"ID":"27161253-4d46-49d6-a665-87451a02c056","Type":"ContainerStarted","Data":"e385bf13f192160720fbf313777e508e74c6f8f981a405b01cd620e2c5854113"} Mar 09 13:37:54 crc kubenswrapper[4703]: I0309 13:37:54.005322 4703 generic.go:334] "Generic (PLEG): container finished" podID="27161253-4d46-49d6-a665-87451a02c056" containerID="e385bf13f192160720fbf313777e508e74c6f8f981a405b01cd620e2c5854113" exitCode=0 Mar 09 13:37:54 crc kubenswrapper[4703]: I0309 13:37:54.005403 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg" event={"ID":"27161253-4d46-49d6-a665-87451a02c056","Type":"ContainerDied","Data":"e385bf13f192160720fbf313777e508e74c6f8f981a405b01cd620e2c5854113"} Mar 09 13:37:55 crc kubenswrapper[4703]: I0309 13:37:55.017139 4703 generic.go:334] "Generic (PLEG): container finished" podID="27161253-4d46-49d6-a665-87451a02c056" containerID="babfd5640e89611c3f0808f3bdc73a9f43dcff78e059456e126a35336c10a9b4" exitCode=0 Mar 09 13:37:55 crc kubenswrapper[4703]: I0309 13:37:55.017478 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg" event={"ID":"27161253-4d46-49d6-a665-87451a02c056","Type":"ContainerDied","Data":"babfd5640e89611c3f0808f3bdc73a9f43dcff78e059456e126a35336c10a9b4"} Mar 09 13:37:56 crc kubenswrapper[4703]: I0309 13:37:56.278214 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg" Mar 09 13:37:56 crc kubenswrapper[4703]: I0309 13:37:56.378589 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6qjw\" (UniqueName: \"kubernetes.io/projected/27161253-4d46-49d6-a665-87451a02c056-kube-api-access-z6qjw\") pod \"27161253-4d46-49d6-a665-87451a02c056\" (UID: \"27161253-4d46-49d6-a665-87451a02c056\") " Mar 09 13:37:56 crc kubenswrapper[4703]: I0309 13:37:56.379391 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27161253-4d46-49d6-a665-87451a02c056-util\") pod \"27161253-4d46-49d6-a665-87451a02c056\" (UID: \"27161253-4d46-49d6-a665-87451a02c056\") " Mar 09 13:37:56 crc kubenswrapper[4703]: I0309 13:37:56.379630 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27161253-4d46-49d6-a665-87451a02c056-bundle\") pod \"27161253-4d46-49d6-a665-87451a02c056\" (UID: \"27161253-4d46-49d6-a665-87451a02c056\") " Mar 09 13:37:56 crc kubenswrapper[4703]: I0309 13:37:56.381098 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27161253-4d46-49d6-a665-87451a02c056-bundle" (OuterVolumeSpecName: "bundle") pod "27161253-4d46-49d6-a665-87451a02c056" (UID: "27161253-4d46-49d6-a665-87451a02c056"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:37:56 crc kubenswrapper[4703]: I0309 13:37:56.385322 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27161253-4d46-49d6-a665-87451a02c056-kube-api-access-z6qjw" (OuterVolumeSpecName: "kube-api-access-z6qjw") pod "27161253-4d46-49d6-a665-87451a02c056" (UID: "27161253-4d46-49d6-a665-87451a02c056"). InnerVolumeSpecName "kube-api-access-z6qjw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:37:56 crc kubenswrapper[4703]: I0309 13:37:56.399555 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27161253-4d46-49d6-a665-87451a02c056-util" (OuterVolumeSpecName: "util") pod "27161253-4d46-49d6-a665-87451a02c056" (UID: "27161253-4d46-49d6-a665-87451a02c056"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:37:56 crc kubenswrapper[4703]: I0309 13:37:56.481235 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6qjw\" (UniqueName: \"kubernetes.io/projected/27161253-4d46-49d6-a665-87451a02c056-kube-api-access-z6qjw\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:56 crc kubenswrapper[4703]: I0309 13:37:56.481285 4703 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27161253-4d46-49d6-a665-87451a02c056-util\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:56 crc kubenswrapper[4703]: I0309 13:37:56.481302 4703 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27161253-4d46-49d6-a665-87451a02c056-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:37:57 crc kubenswrapper[4703]: I0309 13:37:57.038085 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg" event={"ID":"27161253-4d46-49d6-a665-87451a02c056","Type":"ContainerDied","Data":"366fc1bbbe0a8d718964cbcd0527551a3abca7a0c9fdd242c6846db063b223e6"} Mar 09 13:37:57 crc kubenswrapper[4703]: I0309 13:37:57.038154 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="366fc1bbbe0a8d718964cbcd0527551a3abca7a0c9fdd242c6846db063b223e6" Mar 09 13:37:57 crc kubenswrapper[4703]: I0309 13:37:57.038228 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg" Mar 09 13:38:00 crc kubenswrapper[4703]: I0309 13:38:00.152312 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551058-s5rq7"] Mar 09 13:38:00 crc kubenswrapper[4703]: E0309 13:38:00.153206 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27161253-4d46-49d6-a665-87451a02c056" containerName="util" Mar 09 13:38:00 crc kubenswrapper[4703]: I0309 13:38:00.153225 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="27161253-4d46-49d6-a665-87451a02c056" containerName="util" Mar 09 13:38:00 crc kubenswrapper[4703]: E0309 13:38:00.153267 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27161253-4d46-49d6-a665-87451a02c056" containerName="pull" Mar 09 13:38:00 crc kubenswrapper[4703]: I0309 13:38:00.153285 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="27161253-4d46-49d6-a665-87451a02c056" containerName="pull" Mar 09 13:38:00 crc kubenswrapper[4703]: E0309 13:38:00.153322 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27161253-4d46-49d6-a665-87451a02c056" containerName="extract" Mar 09 13:38:00 crc kubenswrapper[4703]: I0309 13:38:00.153332 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="27161253-4d46-49d6-a665-87451a02c056" containerName="extract" Mar 09 13:38:00 crc kubenswrapper[4703]: I0309 13:38:00.153656 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="27161253-4d46-49d6-a665-87451a02c056" containerName="extract" Mar 09 13:38:00 crc kubenswrapper[4703]: I0309 13:38:00.154409 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551058-s5rq7" Mar 09 13:38:00 crc kubenswrapper[4703]: I0309 13:38:00.157398 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:38:00 crc kubenswrapper[4703]: I0309 13:38:00.157651 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rkrjn" Mar 09 13:38:00 crc kubenswrapper[4703]: I0309 13:38:00.158907 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:38:00 crc kubenswrapper[4703]: I0309 13:38:00.160997 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551058-s5rq7"] Mar 09 13:38:00 crc kubenswrapper[4703]: I0309 13:38:00.230889 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-829cf\" (UniqueName: \"kubernetes.io/projected/a518975f-f324-4119-bb77-6f308f0e8731-kube-api-access-829cf\") pod \"auto-csr-approver-29551058-s5rq7\" (UID: \"a518975f-f324-4119-bb77-6f308f0e8731\") " pod="openshift-infra/auto-csr-approver-29551058-s5rq7" Mar 09 13:38:00 crc kubenswrapper[4703]: I0309 13:38:00.332832 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-829cf\" (UniqueName: \"kubernetes.io/projected/a518975f-f324-4119-bb77-6f308f0e8731-kube-api-access-829cf\") pod \"auto-csr-approver-29551058-s5rq7\" (UID: \"a518975f-f324-4119-bb77-6f308f0e8731\") " pod="openshift-infra/auto-csr-approver-29551058-s5rq7" Mar 09 13:38:00 crc kubenswrapper[4703]: I0309 13:38:00.363632 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-829cf\" (UniqueName: \"kubernetes.io/projected/a518975f-f324-4119-bb77-6f308f0e8731-kube-api-access-829cf\") pod \"auto-csr-approver-29551058-s5rq7\" (UID: \"a518975f-f324-4119-bb77-6f308f0e8731\") " 
pod="openshift-infra/auto-csr-approver-29551058-s5rq7" Mar 09 13:38:00 crc kubenswrapper[4703]: I0309 13:38:00.470413 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551058-s5rq7" Mar 09 13:38:00 crc kubenswrapper[4703]: I0309 13:38:00.937090 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551058-s5rq7"] Mar 09 13:38:01 crc kubenswrapper[4703]: I0309 13:38:01.071177 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551058-s5rq7" event={"ID":"a518975f-f324-4119-bb77-6f308f0e8731","Type":"ContainerStarted","Data":"fb30e6a52fb36beedb7ebbc6abd2a4101e21e6fd98ab01a130f3f02d98a5d047"} Mar 09 13:38:03 crc kubenswrapper[4703]: I0309 13:38:03.084597 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551058-s5rq7" event={"ID":"a518975f-f324-4119-bb77-6f308f0e8731","Type":"ContainerStarted","Data":"f73e451c2a24c80f3ff6825370999dbc9d8476b46cf261308d48d0311aee347a"} Mar 09 13:38:03 crc kubenswrapper[4703]: I0309 13:38:03.099127 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551058-s5rq7" podStartSLOduration=1.453513545 podStartE2EDuration="3.099107779s" podCreationTimestamp="2026-03-09 13:38:00 +0000 UTC" firstStartedPulling="2026-03-09 13:38:00.944653041 +0000 UTC m=+1076.912068717" lastFinishedPulling="2026-03-09 13:38:02.590247265 +0000 UTC m=+1078.557662951" observedRunningTime="2026-03-09 13:38:03.094984553 +0000 UTC m=+1079.062400239" watchObservedRunningTime="2026-03-09 13:38:03.099107779 +0000 UTC m=+1079.066523465" Mar 09 13:38:04 crc kubenswrapper[4703]: I0309 13:38:04.093393 4703 generic.go:334] "Generic (PLEG): container finished" podID="a518975f-f324-4119-bb77-6f308f0e8731" containerID="f73e451c2a24c80f3ff6825370999dbc9d8476b46cf261308d48d0311aee347a" exitCode=0 Mar 09 13:38:04 crc 
kubenswrapper[4703]: I0309 13:38:04.093434 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551058-s5rq7" event={"ID":"a518975f-f324-4119-bb77-6f308f0e8731","Type":"ContainerDied","Data":"f73e451c2a24c80f3ff6825370999dbc9d8476b46cf261308d48d0311aee347a"} Mar 09 13:38:05 crc kubenswrapper[4703]: I0309 13:38:05.362562 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551058-s5rq7" Mar 09 13:38:05 crc kubenswrapper[4703]: I0309 13:38:05.396808 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-829cf\" (UniqueName: \"kubernetes.io/projected/a518975f-f324-4119-bb77-6f308f0e8731-kube-api-access-829cf\") pod \"a518975f-f324-4119-bb77-6f308f0e8731\" (UID: \"a518975f-f324-4119-bb77-6f308f0e8731\") " Mar 09 13:38:05 crc kubenswrapper[4703]: I0309 13:38:05.407026 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a518975f-f324-4119-bb77-6f308f0e8731-kube-api-access-829cf" (OuterVolumeSpecName: "kube-api-access-829cf") pod "a518975f-f324-4119-bb77-6f308f0e8731" (UID: "a518975f-f324-4119-bb77-6f308f0e8731"). InnerVolumeSpecName "kube-api-access-829cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:38:05 crc kubenswrapper[4703]: I0309 13:38:05.497661 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-829cf\" (UniqueName: \"kubernetes.io/projected/a518975f-f324-4119-bb77-6f308f0e8731-kube-api-access-829cf\") on node \"crc\" DevicePath \"\"" Mar 09 13:38:05 crc kubenswrapper[4703]: I0309 13:38:05.649093 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2"] Mar 09 13:38:05 crc kubenswrapper[4703]: E0309 13:38:05.649394 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a518975f-f324-4119-bb77-6f308f0e8731" containerName="oc" Mar 09 13:38:05 crc kubenswrapper[4703]: I0309 13:38:05.649419 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="a518975f-f324-4119-bb77-6f308f0e8731" containerName="oc" Mar 09 13:38:05 crc kubenswrapper[4703]: I0309 13:38:05.649567 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="a518975f-f324-4119-bb77-6f308f0e8731" containerName="oc" Mar 09 13:38:05 crc kubenswrapper[4703]: I0309 13:38:05.650296 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2" Mar 09 13:38:05 crc kubenswrapper[4703]: I0309 13:38:05.652179 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-vt7m6" Mar 09 13:38:05 crc kubenswrapper[4703]: I0309 13:38:05.655884 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Mar 09 13:38:05 crc kubenswrapper[4703]: I0309 13:38:05.663560 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2"] Mar 09 13:38:05 crc kubenswrapper[4703]: I0309 13:38:05.699400 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ac75edd0-0cc7-42c5-b1c1-6afe38f7adef-webhook-cert\") pod \"keystone-operator-controller-manager-85f6c9db84-6shz2\" (UID: \"ac75edd0-0cc7-42c5-b1c1-6afe38f7adef\") " pod="openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2" Mar 09 13:38:05 crc kubenswrapper[4703]: I0309 13:38:05.699465 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rtwn\" (UniqueName: \"kubernetes.io/projected/ac75edd0-0cc7-42c5-b1c1-6afe38f7adef-kube-api-access-4rtwn\") pod \"keystone-operator-controller-manager-85f6c9db84-6shz2\" (UID: \"ac75edd0-0cc7-42c5-b1c1-6afe38f7adef\") " pod="openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2" Mar 09 13:38:05 crc kubenswrapper[4703]: I0309 13:38:05.699659 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ac75edd0-0cc7-42c5-b1c1-6afe38f7adef-apiservice-cert\") pod \"keystone-operator-controller-manager-85f6c9db84-6shz2\" 
(UID: \"ac75edd0-0cc7-42c5-b1c1-6afe38f7adef\") " pod="openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2" Mar 09 13:38:05 crc kubenswrapper[4703]: I0309 13:38:05.801402 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ac75edd0-0cc7-42c5-b1c1-6afe38f7adef-webhook-cert\") pod \"keystone-operator-controller-manager-85f6c9db84-6shz2\" (UID: \"ac75edd0-0cc7-42c5-b1c1-6afe38f7adef\") " pod="openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2" Mar 09 13:38:05 crc kubenswrapper[4703]: I0309 13:38:05.801541 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rtwn\" (UniqueName: \"kubernetes.io/projected/ac75edd0-0cc7-42c5-b1c1-6afe38f7adef-kube-api-access-4rtwn\") pod \"keystone-operator-controller-manager-85f6c9db84-6shz2\" (UID: \"ac75edd0-0cc7-42c5-b1c1-6afe38f7adef\") " pod="openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2" Mar 09 13:38:05 crc kubenswrapper[4703]: I0309 13:38:05.801605 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ac75edd0-0cc7-42c5-b1c1-6afe38f7adef-apiservice-cert\") pod \"keystone-operator-controller-manager-85f6c9db84-6shz2\" (UID: \"ac75edd0-0cc7-42c5-b1c1-6afe38f7adef\") " pod="openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2" Mar 09 13:38:05 crc kubenswrapper[4703]: I0309 13:38:05.806215 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ac75edd0-0cc7-42c5-b1c1-6afe38f7adef-apiservice-cert\") pod \"keystone-operator-controller-manager-85f6c9db84-6shz2\" (UID: \"ac75edd0-0cc7-42c5-b1c1-6afe38f7adef\") " pod="openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2" Mar 09 13:38:05 crc kubenswrapper[4703]: I0309 13:38:05.811434 4703 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ac75edd0-0cc7-42c5-b1c1-6afe38f7adef-webhook-cert\") pod \"keystone-operator-controller-manager-85f6c9db84-6shz2\" (UID: \"ac75edd0-0cc7-42c5-b1c1-6afe38f7adef\") " pod="openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2" Mar 09 13:38:05 crc kubenswrapper[4703]: I0309 13:38:05.820330 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rtwn\" (UniqueName: \"kubernetes.io/projected/ac75edd0-0cc7-42c5-b1c1-6afe38f7adef-kube-api-access-4rtwn\") pod \"keystone-operator-controller-manager-85f6c9db84-6shz2\" (UID: \"ac75edd0-0cc7-42c5-b1c1-6afe38f7adef\") " pod="openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2" Mar 09 13:38:06 crc kubenswrapper[4703]: I0309 13:38:06.006278 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2" Mar 09 13:38:06 crc kubenswrapper[4703]: I0309 13:38:06.114273 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551058-s5rq7" event={"ID":"a518975f-f324-4119-bb77-6f308f0e8731","Type":"ContainerDied","Data":"fb30e6a52fb36beedb7ebbc6abd2a4101e21e6fd98ab01a130f3f02d98a5d047"} Mar 09 13:38:06 crc kubenswrapper[4703]: I0309 13:38:06.114650 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb30e6a52fb36beedb7ebbc6abd2a4101e21e6fd98ab01a130f3f02d98a5d047" Mar 09 13:38:06 crc kubenswrapper[4703]: I0309 13:38:06.114700 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551058-s5rq7" Mar 09 13:38:06 crc kubenswrapper[4703]: I0309 13:38:06.181356 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551052-hc57s"] Mar 09 13:38:06 crc kubenswrapper[4703]: I0309 13:38:06.187819 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551052-hc57s"] Mar 09 13:38:06 crc kubenswrapper[4703]: I0309 13:38:06.235006 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2"] Mar 09 13:38:06 crc kubenswrapper[4703]: W0309 13:38:06.242884 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac75edd0_0cc7_42c5_b1c1_6afe38f7adef.slice/crio-25c28d3a36d9c085afa52b3d6c4e103eef8354f10b2206d36d36715d1d8a2280 WatchSource:0}: Error finding container 25c28d3a36d9c085afa52b3d6c4e103eef8354f10b2206d36d36715d1d8a2280: Status 404 returned error can't find the container with id 25c28d3a36d9c085afa52b3d6c4e103eef8354f10b2206d36d36715d1d8a2280 Mar 09 13:38:06 crc kubenswrapper[4703]: I0309 13:38:06.715518 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebaa375d-a2b4-4594-a69c-2da7cab4d396" path="/var/lib/kubelet/pods/ebaa375d-a2b4-4594-a69c-2da7cab4d396/volumes" Mar 09 13:38:07 crc kubenswrapper[4703]: I0309 13:38:07.122467 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2" event={"ID":"ac75edd0-0cc7-42c5-b1c1-6afe38f7adef","Type":"ContainerStarted","Data":"25c28d3a36d9c085afa52b3d6c4e103eef8354f10b2206d36d36715d1d8a2280"} Mar 09 13:38:11 crc kubenswrapper[4703]: I0309 13:38:11.161584 4703 generic.go:334] "Generic (PLEG): container finished" podID="379d4845-913a-4422-915c-221497738cde" 
containerID="b80aeb6f8da13444f0021e8f79e848e61d3e5777cdafbc5520d0d0cb340226fa" exitCode=0 Mar 09 13:38:11 crc kubenswrapper[4703]: I0309 13:38:11.161718 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/rabbitmq-server-0" event={"ID":"379d4845-913a-4422-915c-221497738cde","Type":"ContainerDied","Data":"b80aeb6f8da13444f0021e8f79e848e61d3e5777cdafbc5520d0d0cb340226fa"} Mar 09 13:38:11 crc kubenswrapper[4703]: I0309 13:38:11.166924 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2" event={"ID":"ac75edd0-0cc7-42c5-b1c1-6afe38f7adef","Type":"ContainerStarted","Data":"a99f314dc223c70d8cd12e4db4fec355b6168d0636ab6a9cdbda14738c815083"} Mar 09 13:38:11 crc kubenswrapper[4703]: I0309 13:38:11.167107 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2" Mar 09 13:38:11 crc kubenswrapper[4703]: I0309 13:38:11.218241 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2" podStartSLOduration=2.467423638 podStartE2EDuration="6.218223592s" podCreationTimestamp="2026-03-09 13:38:05 +0000 UTC" firstStartedPulling="2026-03-09 13:38:06.245227688 +0000 UTC m=+1082.212643374" lastFinishedPulling="2026-03-09 13:38:09.996027642 +0000 UTC m=+1085.963443328" observedRunningTime="2026-03-09 13:38:11.214106806 +0000 UTC m=+1087.181522532" watchObservedRunningTime="2026-03-09 13:38:11.218223592 +0000 UTC m=+1087.185639268" Mar 09 13:38:12 crc kubenswrapper[4703]: I0309 13:38:12.174662 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/rabbitmq-server-0" event={"ID":"379d4845-913a-4422-915c-221497738cde","Type":"ContainerStarted","Data":"07c3b6fc10b62a526fc576929a048d2184716d85f04a4b3dd6d74917004eb4f1"} Mar 09 13:38:12 crc kubenswrapper[4703]: I0309 13:38:12.175174 
4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:38:12 crc kubenswrapper[4703]: I0309 13:38:12.210654 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/rabbitmq-server-0" podStartSLOduration=37.659712712 podStartE2EDuration="43.210638646s" podCreationTimestamp="2026-03-09 13:37:29 +0000 UTC" firstStartedPulling="2026-03-09 13:37:31.378961344 +0000 UTC m=+1047.346377030" lastFinishedPulling="2026-03-09 13:37:36.929887268 +0000 UTC m=+1052.897302964" observedRunningTime="2026-03-09 13:38:12.208334661 +0000 UTC m=+1088.175750347" watchObservedRunningTime="2026-03-09 13:38:12.210638646 +0000 UTC m=+1088.178054332" Mar 09 13:38:16 crc kubenswrapper[4703]: I0309 13:38:16.011831 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2" Mar 09 13:38:19 crc kubenswrapper[4703]: I0309 13:38:19.828150 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/ceph"] Mar 09 13:38:19 crc kubenswrapper[4703]: I0309 13:38:19.829089 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/ceph" Mar 09 13:38:19 crc kubenswrapper[4703]: I0309 13:38:19.831998 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"default-dockercfg-wczbt" Mar 09 13:38:19 crc kubenswrapper[4703]: I0309 13:38:19.857356 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-data\") pod \"ceph\" (UID: \"98ec184b-eb8e-4967-b8ec-17cb6f984ccb\") " pod="manila-kuttl-tests/ceph" Mar 09 13:38:19 crc kubenswrapper[4703]: I0309 13:38:19.857411 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-run\") pod \"ceph\" (UID: \"98ec184b-eb8e-4967-b8ec-17cb6f984ccb\") " pod="manila-kuttl-tests/ceph" Mar 09 13:38:19 crc kubenswrapper[4703]: I0309 13:38:19.857536 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-log\") pod \"ceph\" (UID: \"98ec184b-eb8e-4967-b8ec-17cb6f984ccb\") " pod="manila-kuttl-tests/ceph" Mar 09 13:38:19 crc kubenswrapper[4703]: I0309 13:38:19.857683 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckxm7\" (UniqueName: \"kubernetes.io/projected/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-kube-api-access-ckxm7\") pod \"ceph\" (UID: \"98ec184b-eb8e-4967-b8ec-17cb6f984ccb\") " pod="manila-kuttl-tests/ceph" Mar 09 13:38:19 crc kubenswrapper[4703]: I0309 13:38:19.958287 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckxm7\" (UniqueName: \"kubernetes.io/projected/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-kube-api-access-ckxm7\") pod \"ceph\" (UID: \"98ec184b-eb8e-4967-b8ec-17cb6f984ccb\") " 
pod="manila-kuttl-tests/ceph" Mar 09 13:38:19 crc kubenswrapper[4703]: I0309 13:38:19.958563 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-data\") pod \"ceph\" (UID: \"98ec184b-eb8e-4967-b8ec-17cb6f984ccb\") " pod="manila-kuttl-tests/ceph" Mar 09 13:38:19 crc kubenswrapper[4703]: I0309 13:38:19.958601 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-run\") pod \"ceph\" (UID: \"98ec184b-eb8e-4967-b8ec-17cb6f984ccb\") " pod="manila-kuttl-tests/ceph" Mar 09 13:38:19 crc kubenswrapper[4703]: I0309 13:38:19.958625 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-log\") pod \"ceph\" (UID: \"98ec184b-eb8e-4967-b8ec-17cb6f984ccb\") " pod="manila-kuttl-tests/ceph" Mar 09 13:38:19 crc kubenswrapper[4703]: I0309 13:38:19.959065 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-data\") pod \"ceph\" (UID: \"98ec184b-eb8e-4967-b8ec-17cb6f984ccb\") " pod="manila-kuttl-tests/ceph" Mar 09 13:38:19 crc kubenswrapper[4703]: I0309 13:38:19.959641 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-run\") pod \"ceph\" (UID: \"98ec184b-eb8e-4967-b8ec-17cb6f984ccb\") " pod="manila-kuttl-tests/ceph" Mar 09 13:38:19 crc kubenswrapper[4703]: I0309 13:38:19.959727 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-log\") pod \"ceph\" (UID: \"98ec184b-eb8e-4967-b8ec-17cb6f984ccb\") " pod="manila-kuttl-tests/ceph" Mar 09 13:38:19 crc 
kubenswrapper[4703]: I0309 13:38:19.975606 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckxm7\" (UniqueName: \"kubernetes.io/projected/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-kube-api-access-ckxm7\") pod \"ceph\" (UID: \"98ec184b-eb8e-4967-b8ec-17cb6f984ccb\") " pod="manila-kuttl-tests/ceph" Mar 09 13:38:20 crc kubenswrapper[4703]: I0309 13:38:20.143547 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/ceph" Mar 09 13:38:20 crc kubenswrapper[4703]: W0309 13:38:20.175715 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98ec184b_eb8e_4967_b8ec_17cb6f984ccb.slice/crio-db0f196c382ce3a2ed3620a900fb64c1d159aa6da429e53dc7c624ca017f33d6 WatchSource:0}: Error finding container db0f196c382ce3a2ed3620a900fb64c1d159aa6da429e53dc7c624ca017f33d6: Status 404 returned error can't find the container with id db0f196c382ce3a2ed3620a900fb64c1d159aa6da429e53dc7c624ca017f33d6 Mar 09 13:38:20 crc kubenswrapper[4703]: I0309 13:38:20.244265 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/ceph" event={"ID":"98ec184b-eb8e-4967-b8ec-17cb6f984ccb","Type":"ContainerStarted","Data":"db0f196c382ce3a2ed3620a900fb64c1d159aa6da429e53dc7c624ca017f33d6"} Mar 09 13:38:26 crc kubenswrapper[4703]: I0309 13:38:26.549732 4703 scope.go:117] "RemoveContainer" containerID="a4e2e00e94bf933fd38a0cfe881ee74c2acf133d5bfca58901f5f4fc771e2461" Mar 09 13:38:29 crc kubenswrapper[4703]: I0309 13:38:29.728415 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/keystone-db-create-8bms6"] Mar 09 13:38:29 crc kubenswrapper[4703]: I0309 13:38:29.730795 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/keystone-db-create-8bms6" Mar 09 13:38:29 crc kubenswrapper[4703]: I0309 13:38:29.734267 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/keystone-94db-account-create-update-j46q9"] Mar 09 13:38:29 crc kubenswrapper[4703]: I0309 13:38:29.735188 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/keystone-94db-account-create-update-j46q9" Mar 09 13:38:29 crc kubenswrapper[4703]: I0309 13:38:29.736885 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"keystone-db-secret" Mar 09 13:38:29 crc kubenswrapper[4703]: I0309 13:38:29.746394 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/keystone-db-create-8bms6"] Mar 09 13:38:29 crc kubenswrapper[4703]: I0309 13:38:29.754966 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/keystone-94db-account-create-update-j46q9"] Mar 09 13:38:29 crc kubenswrapper[4703]: I0309 13:38:29.827516 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ckt4\" (UniqueName: \"kubernetes.io/projected/fc27e659-7a64-47b8-9dd2-8b52464efc1c-kube-api-access-4ckt4\") pod \"keystone-94db-account-create-update-j46q9\" (UID: \"fc27e659-7a64-47b8-9dd2-8b52464efc1c\") " pod="manila-kuttl-tests/keystone-94db-account-create-update-j46q9" Mar 09 13:38:29 crc kubenswrapper[4703]: I0309 13:38:29.827571 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c1b4f34-e53b-4a26-be67-b31b2b248330-operator-scripts\") pod \"keystone-db-create-8bms6\" (UID: \"8c1b4f34-e53b-4a26-be67-b31b2b248330\") " pod="manila-kuttl-tests/keystone-db-create-8bms6" Mar 09 13:38:29 crc kubenswrapper[4703]: I0309 13:38:29.827609 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc27e659-7a64-47b8-9dd2-8b52464efc1c-operator-scripts\") pod \"keystone-94db-account-create-update-j46q9\" (UID: \"fc27e659-7a64-47b8-9dd2-8b52464efc1c\") " pod="manila-kuttl-tests/keystone-94db-account-create-update-j46q9" Mar 09 13:38:29 crc kubenswrapper[4703]: I0309 13:38:29.827636 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8kvc\" (UniqueName: \"kubernetes.io/projected/8c1b4f34-e53b-4a26-be67-b31b2b248330-kube-api-access-d8kvc\") pod \"keystone-db-create-8bms6\" (UID: \"8c1b4f34-e53b-4a26-be67-b31b2b248330\") " pod="manila-kuttl-tests/keystone-db-create-8bms6" Mar 09 13:38:29 crc kubenswrapper[4703]: I0309 13:38:29.929363 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ckt4\" (UniqueName: \"kubernetes.io/projected/fc27e659-7a64-47b8-9dd2-8b52464efc1c-kube-api-access-4ckt4\") pod \"keystone-94db-account-create-update-j46q9\" (UID: \"fc27e659-7a64-47b8-9dd2-8b52464efc1c\") " pod="manila-kuttl-tests/keystone-94db-account-create-update-j46q9" Mar 09 13:38:29 crc kubenswrapper[4703]: I0309 13:38:29.929411 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c1b4f34-e53b-4a26-be67-b31b2b248330-operator-scripts\") pod \"keystone-db-create-8bms6\" (UID: \"8c1b4f34-e53b-4a26-be67-b31b2b248330\") " pod="manila-kuttl-tests/keystone-db-create-8bms6" Mar 09 13:38:29 crc kubenswrapper[4703]: I0309 13:38:29.929459 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc27e659-7a64-47b8-9dd2-8b52464efc1c-operator-scripts\") pod \"keystone-94db-account-create-update-j46q9\" (UID: \"fc27e659-7a64-47b8-9dd2-8b52464efc1c\") " 
pod="manila-kuttl-tests/keystone-94db-account-create-update-j46q9" Mar 09 13:38:29 crc kubenswrapper[4703]: I0309 13:38:29.929491 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8kvc\" (UniqueName: \"kubernetes.io/projected/8c1b4f34-e53b-4a26-be67-b31b2b248330-kube-api-access-d8kvc\") pod \"keystone-db-create-8bms6\" (UID: \"8c1b4f34-e53b-4a26-be67-b31b2b248330\") " pod="manila-kuttl-tests/keystone-db-create-8bms6" Mar 09 13:38:29 crc kubenswrapper[4703]: I0309 13:38:29.930587 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c1b4f34-e53b-4a26-be67-b31b2b248330-operator-scripts\") pod \"keystone-db-create-8bms6\" (UID: \"8c1b4f34-e53b-4a26-be67-b31b2b248330\") " pod="manila-kuttl-tests/keystone-db-create-8bms6" Mar 09 13:38:29 crc kubenswrapper[4703]: I0309 13:38:29.931406 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc27e659-7a64-47b8-9dd2-8b52464efc1c-operator-scripts\") pod \"keystone-94db-account-create-update-j46q9\" (UID: \"fc27e659-7a64-47b8-9dd2-8b52464efc1c\") " pod="manila-kuttl-tests/keystone-94db-account-create-update-j46q9" Mar 09 13:38:29 crc kubenswrapper[4703]: I0309 13:38:29.951007 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ckt4\" (UniqueName: \"kubernetes.io/projected/fc27e659-7a64-47b8-9dd2-8b52464efc1c-kube-api-access-4ckt4\") pod \"keystone-94db-account-create-update-j46q9\" (UID: \"fc27e659-7a64-47b8-9dd2-8b52464efc1c\") " pod="manila-kuttl-tests/keystone-94db-account-create-update-j46q9" Mar 09 13:38:29 crc kubenswrapper[4703]: I0309 13:38:29.956164 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8kvc\" (UniqueName: \"kubernetes.io/projected/8c1b4f34-e53b-4a26-be67-b31b2b248330-kube-api-access-d8kvc\") pod 
\"keystone-db-create-8bms6\" (UID: \"8c1b4f34-e53b-4a26-be67-b31b2b248330\") " pod="manila-kuttl-tests/keystone-db-create-8bms6" Mar 09 13:38:30 crc kubenswrapper[4703]: I0309 13:38:30.064819 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/keystone-db-create-8bms6" Mar 09 13:38:30 crc kubenswrapper[4703]: I0309 13:38:30.075532 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/keystone-94db-account-create-update-j46q9" Mar 09 13:38:30 crc kubenswrapper[4703]: I0309 13:38:30.941796 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:38:41 crc kubenswrapper[4703]: I0309 13:38:41.144174 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/keystone-db-create-8bms6"] Mar 09 13:38:41 crc kubenswrapper[4703]: W0309 13:38:41.227121 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c1b4f34_e53b_4a26_be67_b31b2b248330.slice/crio-4ebd24451f31d53143eeb8905496f2221b3074a41e88ac01671e7c6ebebb6c43 WatchSource:0}: Error finding container 4ebd24451f31d53143eeb8905496f2221b3074a41e88ac01671e7c6ebebb6c43: Status 404 returned error can't find the container with id 4ebd24451f31d53143eeb8905496f2221b3074a41e88ac01671e7c6ebebb6c43 Mar 09 13:38:41 crc kubenswrapper[4703]: I0309 13:38:41.419224 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone-db-create-8bms6" event={"ID":"8c1b4f34-e53b-4a26-be67-b31b2b248330","Type":"ContainerStarted","Data":"4ebd24451f31d53143eeb8905496f2221b3074a41e88ac01671e7c6ebebb6c43"} Mar 09 13:38:41 crc kubenswrapper[4703]: I0309 13:38:41.454119 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/keystone-94db-account-create-update-j46q9"] Mar 09 13:38:41 crc kubenswrapper[4703]: E0309 13:38:41.888879 4703 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/ceph/demo:latest-squid" Mar 09 13:38:41 crc kubenswrapper[4703]: E0309 13:38:41.889044 4703 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceph,Image:quay.io/ceph/demo:latest-squid,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:MON_IP,Value:192.168.126.11,ValueFrom:nil,},EnvVar{Name:CEPH_DAEMON,Value:demo,ValueFrom:nil,},EnvVar{Name:CEPH_PUBLIC_NETWORK,Value:0.0.0.0/0,ValueFrom:nil,},EnvVar{Name:DEMO_DAEMONS,Value:osd,mds,rgw,ValueFrom:nil,},EnvVar{Name:CEPH_DEMO_UID,Value:0,ValueFrom:nil,},EnvVar{Name:RGW_NAME,Value:ceph,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:data,ReadOnly:false,MountPath:/var/lib/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run,ReadOnly:false,MountPath:/run/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ckxm7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceph_manila-kuttl-tests(98ec184b-eb8e-4967-b8ec-17cb6f984ccb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:38:41 crc kubenswrapper[4703]: 
E0309 13:38:41.890242 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceph\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="manila-kuttl-tests/ceph" podUID="98ec184b-eb8e-4967-b8ec-17cb6f984ccb" Mar 09 13:38:42 crc kubenswrapper[4703]: I0309 13:38:42.426159 4703 generic.go:334] "Generic (PLEG): container finished" podID="fc27e659-7a64-47b8-9dd2-8b52464efc1c" containerID="f2c94226eaaa88ffc320263cbb827a94e454d6ae3a3674526fe9a6165cc33c4b" exitCode=0 Mar 09 13:38:42 crc kubenswrapper[4703]: I0309 13:38:42.426281 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone-94db-account-create-update-j46q9" event={"ID":"fc27e659-7a64-47b8-9dd2-8b52464efc1c","Type":"ContainerDied","Data":"f2c94226eaaa88ffc320263cbb827a94e454d6ae3a3674526fe9a6165cc33c4b"} Mar 09 13:38:42 crc kubenswrapper[4703]: I0309 13:38:42.427175 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone-94db-account-create-update-j46q9" event={"ID":"fc27e659-7a64-47b8-9dd2-8b52464efc1c","Type":"ContainerStarted","Data":"fa4a21e5ca137c2479c3b1ef7340b17d71209aec0926437071131f66acf20318"} Mar 09 13:38:42 crc kubenswrapper[4703]: I0309 13:38:42.428539 4703 generic.go:334] "Generic (PLEG): container finished" podID="8c1b4f34-e53b-4a26-be67-b31b2b248330" containerID="3e4cf567a5837002d2c0141ba0d9042475b10e8811d43c9947560db2df1b5935" exitCode=0 Mar 09 13:38:42 crc kubenswrapper[4703]: I0309 13:38:42.428622 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone-db-create-8bms6" event={"ID":"8c1b4f34-e53b-4a26-be67-b31b2b248330","Type":"ContainerDied","Data":"3e4cf567a5837002d2c0141ba0d9042475b10e8811d43c9947560db2df1b5935"} Mar 09 13:38:42 crc kubenswrapper[4703]: E0309 13:38:42.430124 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceph\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/ceph/demo:latest-squid\\\"\"" pod="manila-kuttl-tests/ceph" podUID="98ec184b-eb8e-4967-b8ec-17cb6f984ccb" Mar 09 13:38:43 crc kubenswrapper[4703]: I0309 13:38:43.754496 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/keystone-db-create-8bms6" Mar 09 13:38:43 crc kubenswrapper[4703]: I0309 13:38:43.810627 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/keystone-94db-account-create-update-j46q9" Mar 09 13:38:43 crc kubenswrapper[4703]: I0309 13:38:43.870110 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8kvc\" (UniqueName: \"kubernetes.io/projected/8c1b4f34-e53b-4a26-be67-b31b2b248330-kube-api-access-d8kvc\") pod \"8c1b4f34-e53b-4a26-be67-b31b2b248330\" (UID: \"8c1b4f34-e53b-4a26-be67-b31b2b248330\") " Mar 09 13:38:43 crc kubenswrapper[4703]: I0309 13:38:43.870261 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c1b4f34-e53b-4a26-be67-b31b2b248330-operator-scripts\") pod \"8c1b4f34-e53b-4a26-be67-b31b2b248330\" (UID: \"8c1b4f34-e53b-4a26-be67-b31b2b248330\") " Mar 09 13:38:43 crc kubenswrapper[4703]: I0309 13:38:43.870741 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c1b4f34-e53b-4a26-be67-b31b2b248330-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c1b4f34-e53b-4a26-be67-b31b2b248330" (UID: "8c1b4f34-e53b-4a26-be67-b31b2b248330"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:38:43 crc kubenswrapper[4703]: I0309 13:38:43.879177 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c1b4f34-e53b-4a26-be67-b31b2b248330-kube-api-access-d8kvc" (OuterVolumeSpecName: "kube-api-access-d8kvc") pod "8c1b4f34-e53b-4a26-be67-b31b2b248330" (UID: "8c1b4f34-e53b-4a26-be67-b31b2b248330"). InnerVolumeSpecName "kube-api-access-d8kvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:38:43 crc kubenswrapper[4703]: I0309 13:38:43.971479 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc27e659-7a64-47b8-9dd2-8b52464efc1c-operator-scripts\") pod \"fc27e659-7a64-47b8-9dd2-8b52464efc1c\" (UID: \"fc27e659-7a64-47b8-9dd2-8b52464efc1c\") " Mar 09 13:38:43 crc kubenswrapper[4703]: I0309 13:38:43.971547 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ckt4\" (UniqueName: \"kubernetes.io/projected/fc27e659-7a64-47b8-9dd2-8b52464efc1c-kube-api-access-4ckt4\") pod \"fc27e659-7a64-47b8-9dd2-8b52464efc1c\" (UID: \"fc27e659-7a64-47b8-9dd2-8b52464efc1c\") " Mar 09 13:38:43 crc kubenswrapper[4703]: I0309 13:38:43.972093 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8kvc\" (UniqueName: \"kubernetes.io/projected/8c1b4f34-e53b-4a26-be67-b31b2b248330-kube-api-access-d8kvc\") on node \"crc\" DevicePath \"\"" Mar 09 13:38:43 crc kubenswrapper[4703]: I0309 13:38:43.972140 4703 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c1b4f34-e53b-4a26-be67-b31b2b248330-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:38:43 crc kubenswrapper[4703]: I0309 13:38:43.972090 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/fc27e659-7a64-47b8-9dd2-8b52464efc1c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fc27e659-7a64-47b8-9dd2-8b52464efc1c" (UID: "fc27e659-7a64-47b8-9dd2-8b52464efc1c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:38:43 crc kubenswrapper[4703]: I0309 13:38:43.975075 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc27e659-7a64-47b8-9dd2-8b52464efc1c-kube-api-access-4ckt4" (OuterVolumeSpecName: "kube-api-access-4ckt4") pod "fc27e659-7a64-47b8-9dd2-8b52464efc1c" (UID: "fc27e659-7a64-47b8-9dd2-8b52464efc1c"). InnerVolumeSpecName "kube-api-access-4ckt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:38:44 crc kubenswrapper[4703]: I0309 13:38:44.073749 4703 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc27e659-7a64-47b8-9dd2-8b52464efc1c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:38:44 crc kubenswrapper[4703]: I0309 13:38:44.073785 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ckt4\" (UniqueName: \"kubernetes.io/projected/fc27e659-7a64-47b8-9dd2-8b52464efc1c-kube-api-access-4ckt4\") on node \"crc\" DevicePath \"\"" Mar 09 13:38:44 crc kubenswrapper[4703]: I0309 13:38:44.445262 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone-db-create-8bms6" event={"ID":"8c1b4f34-e53b-4a26-be67-b31b2b248330","Type":"ContainerDied","Data":"4ebd24451f31d53143eeb8905496f2221b3074a41e88ac01671e7c6ebebb6c43"} Mar 09 13:38:44 crc kubenswrapper[4703]: I0309 13:38:44.445331 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ebd24451f31d53143eeb8905496f2221b3074a41e88ac01671e7c6ebebb6c43" Mar 09 13:38:44 crc kubenswrapper[4703]: I0309 13:38:44.445330 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/keystone-db-create-8bms6" Mar 09 13:38:44 crc kubenswrapper[4703]: I0309 13:38:44.447088 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone-94db-account-create-update-j46q9" event={"ID":"fc27e659-7a64-47b8-9dd2-8b52464efc1c","Type":"ContainerDied","Data":"fa4a21e5ca137c2479c3b1ef7340b17d71209aec0926437071131f66acf20318"} Mar 09 13:38:44 crc kubenswrapper[4703]: I0309 13:38:44.447129 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa4a21e5ca137c2479c3b1ef7340b17d71209aec0926437071131f66acf20318" Mar 09 13:38:44 crc kubenswrapper[4703]: I0309 13:38:44.447150 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/keystone-94db-account-create-update-j46q9" Mar 09 13:38:50 crc kubenswrapper[4703]: I0309 13:38:50.569061 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/keystone-db-sync-699rj"] Mar 09 13:38:50 crc kubenswrapper[4703]: E0309 13:38:50.570119 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc27e659-7a64-47b8-9dd2-8b52464efc1c" containerName="mariadb-account-create-update" Mar 09 13:38:50 crc kubenswrapper[4703]: I0309 13:38:50.570142 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc27e659-7a64-47b8-9dd2-8b52464efc1c" containerName="mariadb-account-create-update" Mar 09 13:38:50 crc kubenswrapper[4703]: E0309 13:38:50.570174 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c1b4f34-e53b-4a26-be67-b31b2b248330" containerName="mariadb-database-create" Mar 09 13:38:50 crc kubenswrapper[4703]: I0309 13:38:50.570189 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c1b4f34-e53b-4a26-be67-b31b2b248330" containerName="mariadb-database-create" Mar 09 13:38:50 crc kubenswrapper[4703]: I0309 13:38:50.570384 4703 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fc27e659-7a64-47b8-9dd2-8b52464efc1c" containerName="mariadb-account-create-update" Mar 09 13:38:50 crc kubenswrapper[4703]: I0309 13:38:50.570403 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c1b4f34-e53b-4a26-be67-b31b2b248330" containerName="mariadb-database-create" Mar 09 13:38:50 crc kubenswrapper[4703]: I0309 13:38:50.571083 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/keystone-db-sync-699rj" Mar 09 13:38:50 crc kubenswrapper[4703]: I0309 13:38:50.577097 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"keystone-keystone-dockercfg-bzvs2" Mar 09 13:38:50 crc kubenswrapper[4703]: I0309 13:38:50.577144 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"keystone" Mar 09 13:38:50 crc kubenswrapper[4703]: I0309 13:38:50.577097 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"keystone-scripts" Mar 09 13:38:50 crc kubenswrapper[4703]: I0309 13:38:50.577097 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"keystone-config-data" Mar 09 13:38:50 crc kubenswrapper[4703]: I0309 13:38:50.581744 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/keystone-db-sync-699rj"] Mar 09 13:38:50 crc kubenswrapper[4703]: I0309 13:38:50.671666 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e32ebb9-2351-4afc-9313-5ad8d978bed8-config-data\") pod \"keystone-db-sync-699rj\" (UID: \"0e32ebb9-2351-4afc-9313-5ad8d978bed8\") " pod="manila-kuttl-tests/keystone-db-sync-699rj" Mar 09 13:38:50 crc kubenswrapper[4703]: I0309 13:38:50.671717 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf45r\" (UniqueName: 
\"kubernetes.io/projected/0e32ebb9-2351-4afc-9313-5ad8d978bed8-kube-api-access-pf45r\") pod \"keystone-db-sync-699rj\" (UID: \"0e32ebb9-2351-4afc-9313-5ad8d978bed8\") " pod="manila-kuttl-tests/keystone-db-sync-699rj" Mar 09 13:38:50 crc kubenswrapper[4703]: I0309 13:38:50.773399 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e32ebb9-2351-4afc-9313-5ad8d978bed8-config-data\") pod \"keystone-db-sync-699rj\" (UID: \"0e32ebb9-2351-4afc-9313-5ad8d978bed8\") " pod="manila-kuttl-tests/keystone-db-sync-699rj" Mar 09 13:38:50 crc kubenswrapper[4703]: I0309 13:38:50.773494 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf45r\" (UniqueName: \"kubernetes.io/projected/0e32ebb9-2351-4afc-9313-5ad8d978bed8-kube-api-access-pf45r\") pod \"keystone-db-sync-699rj\" (UID: \"0e32ebb9-2351-4afc-9313-5ad8d978bed8\") " pod="manila-kuttl-tests/keystone-db-sync-699rj" Mar 09 13:38:50 crc kubenswrapper[4703]: I0309 13:38:50.783388 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e32ebb9-2351-4afc-9313-5ad8d978bed8-config-data\") pod \"keystone-db-sync-699rj\" (UID: \"0e32ebb9-2351-4afc-9313-5ad8d978bed8\") " pod="manila-kuttl-tests/keystone-db-sync-699rj" Mar 09 13:38:50 crc kubenswrapper[4703]: I0309 13:38:50.792203 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf45r\" (UniqueName: \"kubernetes.io/projected/0e32ebb9-2351-4afc-9313-5ad8d978bed8-kube-api-access-pf45r\") pod \"keystone-db-sync-699rj\" (UID: \"0e32ebb9-2351-4afc-9313-5ad8d978bed8\") " pod="manila-kuttl-tests/keystone-db-sync-699rj" Mar 09 13:38:50 crc kubenswrapper[4703]: I0309 13:38:50.898023 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/keystone-db-sync-699rj" Mar 09 13:38:51 crc kubenswrapper[4703]: I0309 13:38:51.374420 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/keystone-db-sync-699rj"] Mar 09 13:38:51 crc kubenswrapper[4703]: W0309 13:38:51.382398 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e32ebb9_2351_4afc_9313_5ad8d978bed8.slice/crio-18cf3e02a8d07af0e91e94e143a71f90d137020bb2e22f39c4ca7f27666d8bd5 WatchSource:0}: Error finding container 18cf3e02a8d07af0e91e94e143a71f90d137020bb2e22f39c4ca7f27666d8bd5: Status 404 returned error can't find the container with id 18cf3e02a8d07af0e91e94e143a71f90d137020bb2e22f39c4ca7f27666d8bd5 Mar 09 13:38:51 crc kubenswrapper[4703]: I0309 13:38:51.513201 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone-db-sync-699rj" event={"ID":"0e32ebb9-2351-4afc-9313-5ad8d978bed8","Type":"ContainerStarted","Data":"18cf3e02a8d07af0e91e94e143a71f90d137020bb2e22f39c4ca7f27666d8bd5"} Mar 09 13:39:04 crc kubenswrapper[4703]: E0309 13:39:04.546605 4703 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-keystone:current-podified" Mar 09 13:39:04 crc kubenswrapper[4703]: E0309 13:39:04.547454 4703 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:keystone-db-sync,Image:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,Command:[/bin/bash],Args:[-c keystone-manage 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/keystone/keystone.conf,SubPath:keystone.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pf45r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42425,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42425,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-db-sync-699rj_manila-kuttl-tests(0e32ebb9-2351-4afc-9313-5ad8d978bed8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 13:39:04 crc kubenswrapper[4703]: E0309 13:39:04.548723 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"keystone-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="manila-kuttl-tests/keystone-db-sync-699rj" podUID="0e32ebb9-2351-4afc-9313-5ad8d978bed8" Mar 09 
13:39:04 crc kubenswrapper[4703]: E0309 13:39:04.877648 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"keystone-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-keystone:current-podified\\\"\"" pod="manila-kuttl-tests/keystone-db-sync-699rj" podUID="0e32ebb9-2351-4afc-9313-5ad8d978bed8" Mar 09 13:39:05 crc kubenswrapper[4703]: I0309 13:39:05.637252 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/ceph" event={"ID":"98ec184b-eb8e-4967-b8ec-17cb6f984ccb","Type":"ContainerStarted","Data":"d7b39f5cef1fc23c2e91878c4692c968d24bdb7697cd879ccc1febd9f9cad2aa"} Mar 09 13:39:05 crc kubenswrapper[4703]: I0309 13:39:05.663829 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/ceph" podStartSLOduration=1.964438388 podStartE2EDuration="46.663809651s" podCreationTimestamp="2026-03-09 13:38:19 +0000 UTC" firstStartedPulling="2026-03-09 13:38:20.182320162 +0000 UTC m=+1096.149735848" lastFinishedPulling="2026-03-09 13:39:04.881691405 +0000 UTC m=+1140.849107111" observedRunningTime="2026-03-09 13:39:05.663461221 +0000 UTC m=+1141.630876917" watchObservedRunningTime="2026-03-09 13:39:05.663809651 +0000 UTC m=+1141.631225347" Mar 09 13:39:09 crc kubenswrapper[4703]: I0309 13:39:09.500436 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:39:09 crc kubenswrapper[4703]: I0309 13:39:09.501151 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:39:15 crc kubenswrapper[4703]: E0309 13:39:15.668209 4703 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.19:51032->38.129.56.19:44129: write tcp 38.129.56.19:51032->38.129.56.19:44129: write: broken pipe Mar 09 13:39:18 crc kubenswrapper[4703]: I0309 13:39:18.759292 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone-db-sync-699rj" event={"ID":"0e32ebb9-2351-4afc-9313-5ad8d978bed8","Type":"ContainerStarted","Data":"aa10110b3158b4476c3eae66a3096c4fbd3eeff75ac04dbf4d268f617988039c"} Mar 09 13:39:18 crc kubenswrapper[4703]: I0309 13:39:18.788012 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/keystone-db-sync-699rj" podStartSLOduration=1.9714524450000002 podStartE2EDuration="28.787987302s" podCreationTimestamp="2026-03-09 13:38:50 +0000 UTC" firstStartedPulling="2026-03-09 13:38:51.383826341 +0000 UTC m=+1127.351242067" lastFinishedPulling="2026-03-09 13:39:18.200361238 +0000 UTC m=+1154.167776924" observedRunningTime="2026-03-09 13:39:18.777503756 +0000 UTC m=+1154.744919462" watchObservedRunningTime="2026-03-09 13:39:18.787987302 +0000 UTC m=+1154.755403028" Mar 09 13:39:21 crc kubenswrapper[4703]: I0309 13:39:21.783007 4703 generic.go:334] "Generic (PLEG): container finished" podID="0e32ebb9-2351-4afc-9313-5ad8d978bed8" containerID="aa10110b3158b4476c3eae66a3096c4fbd3eeff75ac04dbf4d268f617988039c" exitCode=0 Mar 09 13:39:21 crc kubenswrapper[4703]: I0309 13:39:21.783073 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone-db-sync-699rj" event={"ID":"0e32ebb9-2351-4afc-9313-5ad8d978bed8","Type":"ContainerDied","Data":"aa10110b3158b4476c3eae66a3096c4fbd3eeff75ac04dbf4d268f617988039c"} Mar 09 13:39:23 crc kubenswrapper[4703]: I0309 13:39:23.024659 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/keystone-db-sync-699rj" Mar 09 13:39:23 crc kubenswrapper[4703]: I0309 13:39:23.188087 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pf45r\" (UniqueName: \"kubernetes.io/projected/0e32ebb9-2351-4afc-9313-5ad8d978bed8-kube-api-access-pf45r\") pod \"0e32ebb9-2351-4afc-9313-5ad8d978bed8\" (UID: \"0e32ebb9-2351-4afc-9313-5ad8d978bed8\") " Mar 09 13:39:23 crc kubenswrapper[4703]: I0309 13:39:23.188432 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e32ebb9-2351-4afc-9313-5ad8d978bed8-config-data\") pod \"0e32ebb9-2351-4afc-9313-5ad8d978bed8\" (UID: \"0e32ebb9-2351-4afc-9313-5ad8d978bed8\") " Mar 09 13:39:23 crc kubenswrapper[4703]: I0309 13:39:23.202511 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e32ebb9-2351-4afc-9313-5ad8d978bed8-kube-api-access-pf45r" (OuterVolumeSpecName: "kube-api-access-pf45r") pod "0e32ebb9-2351-4afc-9313-5ad8d978bed8" (UID: "0e32ebb9-2351-4afc-9313-5ad8d978bed8"). InnerVolumeSpecName "kube-api-access-pf45r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:39:23 crc kubenswrapper[4703]: I0309 13:39:23.225885 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e32ebb9-2351-4afc-9313-5ad8d978bed8-config-data" (OuterVolumeSpecName: "config-data") pod "0e32ebb9-2351-4afc-9313-5ad8d978bed8" (UID: "0e32ebb9-2351-4afc-9313-5ad8d978bed8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:39:23 crc kubenswrapper[4703]: I0309 13:39:23.290045 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pf45r\" (UniqueName: \"kubernetes.io/projected/0e32ebb9-2351-4afc-9313-5ad8d978bed8-kube-api-access-pf45r\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:23 crc kubenswrapper[4703]: I0309 13:39:23.290074 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e32ebb9-2351-4afc-9313-5ad8d978bed8-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:23 crc kubenswrapper[4703]: I0309 13:39:23.825204 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone-db-sync-699rj" event={"ID":"0e32ebb9-2351-4afc-9313-5ad8d978bed8","Type":"ContainerDied","Data":"18cf3e02a8d07af0e91e94e143a71f90d137020bb2e22f39c4ca7f27666d8bd5"} Mar 09 13:39:23 crc kubenswrapper[4703]: I0309 13:39:23.825281 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18cf3e02a8d07af0e91e94e143a71f90d137020bb2e22f39c4ca7f27666d8bd5" Mar 09 13:39:23 crc kubenswrapper[4703]: I0309 13:39:23.825280 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/keystone-db-sync-699rj" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.024552 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/keystone-bootstrap-km8qk"] Mar 09 13:39:24 crc kubenswrapper[4703]: E0309 13:39:24.024858 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e32ebb9-2351-4afc-9313-5ad8d978bed8" containerName="keystone-db-sync" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.024878 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e32ebb9-2351-4afc-9313-5ad8d978bed8" containerName="keystone-db-sync" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.025021 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e32ebb9-2351-4afc-9313-5ad8d978bed8" containerName="keystone-db-sync" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.025480 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/keystone-bootstrap-km8qk" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.027119 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"keystone-config-data" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.027204 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"osp-secret" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.027488 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"keystone-keystone-dockercfg-bzvs2" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.027556 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"keystone-scripts" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.028278 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"keystone" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.035585 4703 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["manila-kuttl-tests/keystone-bootstrap-km8qk"] Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.222385 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-scripts\") pod \"keystone-bootstrap-km8qk\" (UID: \"9a287bcb-ad78-4309-a44d-deb8e59370e7\") " pod="manila-kuttl-tests/keystone-bootstrap-km8qk" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.222446 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-fernet-keys\") pod \"keystone-bootstrap-km8qk\" (UID: \"9a287bcb-ad78-4309-a44d-deb8e59370e7\") " pod="manila-kuttl-tests/keystone-bootstrap-km8qk" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.222499 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-credential-keys\") pod \"keystone-bootstrap-km8qk\" (UID: \"9a287bcb-ad78-4309-a44d-deb8e59370e7\") " pod="manila-kuttl-tests/keystone-bootstrap-km8qk" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.222589 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89bq8\" (UniqueName: \"kubernetes.io/projected/9a287bcb-ad78-4309-a44d-deb8e59370e7-kube-api-access-89bq8\") pod \"keystone-bootstrap-km8qk\" (UID: \"9a287bcb-ad78-4309-a44d-deb8e59370e7\") " pod="manila-kuttl-tests/keystone-bootstrap-km8qk" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.222686 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-config-data\") pod \"keystone-bootstrap-km8qk\" (UID: 
\"9a287bcb-ad78-4309-a44d-deb8e59370e7\") " pod="manila-kuttl-tests/keystone-bootstrap-km8qk" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.323798 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-credential-keys\") pod \"keystone-bootstrap-km8qk\" (UID: \"9a287bcb-ad78-4309-a44d-deb8e59370e7\") " pod="manila-kuttl-tests/keystone-bootstrap-km8qk" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.323923 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89bq8\" (UniqueName: \"kubernetes.io/projected/9a287bcb-ad78-4309-a44d-deb8e59370e7-kube-api-access-89bq8\") pod \"keystone-bootstrap-km8qk\" (UID: \"9a287bcb-ad78-4309-a44d-deb8e59370e7\") " pod="manila-kuttl-tests/keystone-bootstrap-km8qk" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.324025 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-config-data\") pod \"keystone-bootstrap-km8qk\" (UID: \"9a287bcb-ad78-4309-a44d-deb8e59370e7\") " pod="manila-kuttl-tests/keystone-bootstrap-km8qk" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.324090 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-scripts\") pod \"keystone-bootstrap-km8qk\" (UID: \"9a287bcb-ad78-4309-a44d-deb8e59370e7\") " pod="manila-kuttl-tests/keystone-bootstrap-km8qk" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.324138 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-fernet-keys\") pod \"keystone-bootstrap-km8qk\" (UID: \"9a287bcb-ad78-4309-a44d-deb8e59370e7\") " 
pod="manila-kuttl-tests/keystone-bootstrap-km8qk" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.330544 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-scripts\") pod \"keystone-bootstrap-km8qk\" (UID: \"9a287bcb-ad78-4309-a44d-deb8e59370e7\") " pod="manila-kuttl-tests/keystone-bootstrap-km8qk" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.331311 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-credential-keys\") pod \"keystone-bootstrap-km8qk\" (UID: \"9a287bcb-ad78-4309-a44d-deb8e59370e7\") " pod="manila-kuttl-tests/keystone-bootstrap-km8qk" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.331374 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-fernet-keys\") pod \"keystone-bootstrap-km8qk\" (UID: \"9a287bcb-ad78-4309-a44d-deb8e59370e7\") " pod="manila-kuttl-tests/keystone-bootstrap-km8qk" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.333322 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-config-data\") pod \"keystone-bootstrap-km8qk\" (UID: \"9a287bcb-ad78-4309-a44d-deb8e59370e7\") " pod="manila-kuttl-tests/keystone-bootstrap-km8qk" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.348708 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89bq8\" (UniqueName: \"kubernetes.io/projected/9a287bcb-ad78-4309-a44d-deb8e59370e7-kube-api-access-89bq8\") pod \"keystone-bootstrap-km8qk\" (UID: \"9a287bcb-ad78-4309-a44d-deb8e59370e7\") " pod="manila-kuttl-tests/keystone-bootstrap-km8qk" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 
13:39:24.644932 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/keystone-bootstrap-km8qk" Mar 09 13:39:24 crc kubenswrapper[4703]: I0309 13:39:24.871153 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/keystone-bootstrap-km8qk"] Mar 09 13:39:25 crc kubenswrapper[4703]: I0309 13:39:25.842514 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone-bootstrap-km8qk" event={"ID":"9a287bcb-ad78-4309-a44d-deb8e59370e7","Type":"ContainerStarted","Data":"ca476d16536191b77b407c1593f9e7ea379047a19311312edce7d6f3805e8311"} Mar 09 13:39:25 crc kubenswrapper[4703]: I0309 13:39:25.842923 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone-bootstrap-km8qk" event={"ID":"9a287bcb-ad78-4309-a44d-deb8e59370e7","Type":"ContainerStarted","Data":"6be16499d24c573f8712b42d30c894ad7b01c643f6849485c0dc983712a2a158"} Mar 09 13:39:25 crc kubenswrapper[4703]: I0309 13:39:25.864258 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/keystone-bootstrap-km8qk" podStartSLOduration=2.86423697 podStartE2EDuration="2.86423697s" podCreationTimestamp="2026-03-09 13:39:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:39:25.861604225 +0000 UTC m=+1161.829019931" watchObservedRunningTime="2026-03-09 13:39:25.86423697 +0000 UTC m=+1161.831652676" Mar 09 13:39:27 crc kubenswrapper[4703]: I0309 13:39:27.860528 4703 generic.go:334] "Generic (PLEG): container finished" podID="9a287bcb-ad78-4309-a44d-deb8e59370e7" containerID="ca476d16536191b77b407c1593f9e7ea379047a19311312edce7d6f3805e8311" exitCode=0 Mar 09 13:39:27 crc kubenswrapper[4703]: I0309 13:39:27.860630 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone-bootstrap-km8qk" 
event={"ID":"9a287bcb-ad78-4309-a44d-deb8e59370e7","Type":"ContainerDied","Data":"ca476d16536191b77b407c1593f9e7ea379047a19311312edce7d6f3805e8311"} Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.121017 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/keystone-bootstrap-km8qk" Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.302262 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-credential-keys\") pod \"9a287bcb-ad78-4309-a44d-deb8e59370e7\" (UID: \"9a287bcb-ad78-4309-a44d-deb8e59370e7\") " Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.302310 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-fernet-keys\") pod \"9a287bcb-ad78-4309-a44d-deb8e59370e7\" (UID: \"9a287bcb-ad78-4309-a44d-deb8e59370e7\") " Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.302418 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-config-data\") pod \"9a287bcb-ad78-4309-a44d-deb8e59370e7\" (UID: \"9a287bcb-ad78-4309-a44d-deb8e59370e7\") " Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.302440 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89bq8\" (UniqueName: \"kubernetes.io/projected/9a287bcb-ad78-4309-a44d-deb8e59370e7-kube-api-access-89bq8\") pod \"9a287bcb-ad78-4309-a44d-deb8e59370e7\" (UID: \"9a287bcb-ad78-4309-a44d-deb8e59370e7\") " Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.302484 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-scripts\") 
pod \"9a287bcb-ad78-4309-a44d-deb8e59370e7\" (UID: \"9a287bcb-ad78-4309-a44d-deb8e59370e7\") " Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.308432 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9a287bcb-ad78-4309-a44d-deb8e59370e7" (UID: "9a287bcb-ad78-4309-a44d-deb8e59370e7"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.308444 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a287bcb-ad78-4309-a44d-deb8e59370e7-kube-api-access-89bq8" (OuterVolumeSpecName: "kube-api-access-89bq8") pod "9a287bcb-ad78-4309-a44d-deb8e59370e7" (UID: "9a287bcb-ad78-4309-a44d-deb8e59370e7"). InnerVolumeSpecName "kube-api-access-89bq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.308486 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-scripts" (OuterVolumeSpecName: "scripts") pod "9a287bcb-ad78-4309-a44d-deb8e59370e7" (UID: "9a287bcb-ad78-4309-a44d-deb8e59370e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.309998 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9a287bcb-ad78-4309-a44d-deb8e59370e7" (UID: "9a287bcb-ad78-4309-a44d-deb8e59370e7"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.326719 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-config-data" (OuterVolumeSpecName: "config-data") pod "9a287bcb-ad78-4309-a44d-deb8e59370e7" (UID: "9a287bcb-ad78-4309-a44d-deb8e59370e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.403646 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.403682 4703 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.403691 4703 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.403700 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a287bcb-ad78-4309-a44d-deb8e59370e7-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.403710 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89bq8\" (UniqueName: \"kubernetes.io/projected/9a287bcb-ad78-4309-a44d-deb8e59370e7-kube-api-access-89bq8\") on node \"crc\" DevicePath \"\"" Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.879411 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone-bootstrap-km8qk" 
event={"ID":"9a287bcb-ad78-4309-a44d-deb8e59370e7","Type":"ContainerDied","Data":"6be16499d24c573f8712b42d30c894ad7b01c643f6849485c0dc983712a2a158"} Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.879473 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6be16499d24c573f8712b42d30c894ad7b01c643f6849485c0dc983712a2a158" Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.879913 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/keystone-bootstrap-km8qk" Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.980989 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/keystone-5b95c486b5-lnbh5"] Mar 09 13:39:29 crc kubenswrapper[4703]: E0309 13:39:29.981307 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a287bcb-ad78-4309-a44d-deb8e59370e7" containerName="keystone-bootstrap" Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.981326 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a287bcb-ad78-4309-a44d-deb8e59370e7" containerName="keystone-bootstrap" Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.981472 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a287bcb-ad78-4309-a44d-deb8e59370e7" containerName="keystone-bootstrap" Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.981973 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5" Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.986132 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"keystone-keystone-dockercfg-bzvs2" Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.986365 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"keystone" Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.986580 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"keystone-scripts" Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.986798 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"keystone-config-data" Mar 09 13:39:29 crc kubenswrapper[4703]: I0309 13:39:29.995514 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/keystone-5b95c486b5-lnbh5"] Mar 09 13:39:30 crc kubenswrapper[4703]: I0309 13:39:30.013098 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvmsl\" (UniqueName: \"kubernetes.io/projected/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-kube-api-access-lvmsl\") pod \"keystone-5b95c486b5-lnbh5\" (UID: \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\") " pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5" Mar 09 13:39:30 crc kubenswrapper[4703]: I0309 13:39:30.013146 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-scripts\") pod \"keystone-5b95c486b5-lnbh5\" (UID: \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\") " pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5" Mar 09 13:39:30 crc kubenswrapper[4703]: I0309 13:39:30.013282 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-config-data\") pod \"keystone-5b95c486b5-lnbh5\" (UID: \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\") " pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5" Mar 09 13:39:30 crc kubenswrapper[4703]: I0309 13:39:30.013305 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-credential-keys\") pod \"keystone-5b95c486b5-lnbh5\" (UID: \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\") " pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5" Mar 09 13:39:30 crc kubenswrapper[4703]: I0309 13:39:30.013343 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-fernet-keys\") pod \"keystone-5b95c486b5-lnbh5\" (UID: \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\") " pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5" Mar 09 13:39:30 crc kubenswrapper[4703]: I0309 13:39:30.114446 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-config-data\") pod \"keystone-5b95c486b5-lnbh5\" (UID: \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\") " pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5" Mar 09 13:39:30 crc kubenswrapper[4703]: I0309 13:39:30.114490 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-credential-keys\") pod \"keystone-5b95c486b5-lnbh5\" (UID: \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\") " pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5" Mar 09 13:39:30 crc kubenswrapper[4703]: I0309 13:39:30.114519 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-fernet-keys\") pod \"keystone-5b95c486b5-lnbh5\" (UID: \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\") " pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5" Mar 09 13:39:30 crc kubenswrapper[4703]: I0309 13:39:30.114550 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvmsl\" (UniqueName: \"kubernetes.io/projected/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-kube-api-access-lvmsl\") pod \"keystone-5b95c486b5-lnbh5\" (UID: \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\") " pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5" Mar 09 13:39:30 crc kubenswrapper[4703]: I0309 13:39:30.114571 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-scripts\") pod \"keystone-5b95c486b5-lnbh5\" (UID: \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\") " pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5" Mar 09 13:39:30 crc kubenswrapper[4703]: I0309 13:39:30.120424 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-credential-keys\") pod \"keystone-5b95c486b5-lnbh5\" (UID: \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\") " pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5" Mar 09 13:39:30 crc kubenswrapper[4703]: I0309 13:39:30.121020 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-scripts\") pod \"keystone-5b95c486b5-lnbh5\" (UID: \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\") " pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5" Mar 09 13:39:30 crc kubenswrapper[4703]: I0309 13:39:30.121617 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-fernet-keys\") pod 
\"keystone-5b95c486b5-lnbh5\" (UID: \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\") " pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5"
Mar 09 13:39:30 crc kubenswrapper[4703]: I0309 13:39:30.131653 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-config-data\") pod \"keystone-5b95c486b5-lnbh5\" (UID: \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\") " pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5"
Mar 09 13:39:30 crc kubenswrapper[4703]: I0309 13:39:30.145210 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvmsl\" (UniqueName: \"kubernetes.io/projected/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-kube-api-access-lvmsl\") pod \"keystone-5b95c486b5-lnbh5\" (UID: \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\") " pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5"
Mar 09 13:39:30 crc kubenswrapper[4703]: I0309 13:39:30.325418 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5"
Mar 09 13:39:30 crc kubenswrapper[4703]: I0309 13:39:30.627841 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/keystone-5b95c486b5-lnbh5"]
Mar 09 13:39:30 crc kubenswrapper[4703]: W0309 13:39:30.632028 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9ac0ae5_2538_48ef_bd64_e9887d90ff39.slice/crio-6da66cdd0c3a51d41a6f98989457a7bbb686fab0ecb381ae740ab69453534247 WatchSource:0}: Error finding container 6da66cdd0c3a51d41a6f98989457a7bbb686fab0ecb381ae740ab69453534247: Status 404 returned error can't find the container with id 6da66cdd0c3a51d41a6f98989457a7bbb686fab0ecb381ae740ab69453534247
Mar 09 13:39:30 crc kubenswrapper[4703]: I0309 13:39:30.890457 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5" event={"ID":"a9ac0ae5-2538-48ef-bd64-e9887d90ff39","Type":"ContainerStarted","Data":"b73448b46df1614f1432c93f8e29c57755391564c9ee9301fe2e705e381a034a"}
Mar 09 13:39:30 crc kubenswrapper[4703]: I0309 13:39:30.890869 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5"
Mar 09 13:39:30 crc kubenswrapper[4703]: I0309 13:39:30.890884 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5" event={"ID":"a9ac0ae5-2538-48ef-bd64-e9887d90ff39","Type":"ContainerStarted","Data":"6da66cdd0c3a51d41a6f98989457a7bbb686fab0ecb381ae740ab69453534247"}
Mar 09 13:39:30 crc kubenswrapper[4703]: I0309 13:39:30.913180 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5" podStartSLOduration=1.913152491 podStartE2EDuration="1.913152491s" podCreationTimestamp="2026-03-09 13:39:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:39:30.909723584 +0000 UTC m=+1166.877139320" watchObservedRunningTime="2026-03-09 13:39:30.913152491 +0000 UTC m=+1166.880568197"
Mar 09 13:39:39 crc kubenswrapper[4703]: I0309 13:39:39.500561 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:39:39 crc kubenswrapper[4703]: I0309 13:39:39.501483 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:39:41 crc kubenswrapper[4703]: E0309 13:39:41.071908 4703 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.19:48724->38.129.56.19:44129: write tcp 38.129.56.19:48724->38.129.56.19:44129: write: broken pipe
Mar 09 13:39:57 crc kubenswrapper[4703]: I0309 13:39:57.850335 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-index-x7vb7"]
Mar 09 13:39:57 crc kubenswrapper[4703]: I0309 13:39:57.851705 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-index-x7vb7"
Mar 09 13:39:57 crc kubenswrapper[4703]: I0309 13:39:57.853979 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-index-dockercfg-6hg8d"
Mar 09 13:39:57 crc kubenswrapper[4703]: I0309 13:39:57.864677 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-index-x7vb7"]
Mar 09 13:39:57 crc kubenswrapper[4703]: I0309 13:39:57.922303 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnstz\" (UniqueName: \"kubernetes.io/projected/2ec1ce09-e145-4077-90a6-39e531280898-kube-api-access-cnstz\") pod \"manila-operator-index-x7vb7\" (UID: \"2ec1ce09-e145-4077-90a6-39e531280898\") " pod="openstack-operators/manila-operator-index-x7vb7"
Mar 09 13:39:58 crc kubenswrapper[4703]: I0309 13:39:58.023923 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnstz\" (UniqueName: \"kubernetes.io/projected/2ec1ce09-e145-4077-90a6-39e531280898-kube-api-access-cnstz\") pod \"manila-operator-index-x7vb7\" (UID: \"2ec1ce09-e145-4077-90a6-39e531280898\") " pod="openstack-operators/manila-operator-index-x7vb7"
Mar 09 13:39:58 crc kubenswrapper[4703]: I0309 13:39:58.044476 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnstz\" (UniqueName: \"kubernetes.io/projected/2ec1ce09-e145-4077-90a6-39e531280898-kube-api-access-cnstz\") pod \"manila-operator-index-x7vb7\" (UID: \"2ec1ce09-e145-4077-90a6-39e531280898\") " pod="openstack-operators/manila-operator-index-x7vb7"
Mar 09 13:39:58 crc kubenswrapper[4703]: I0309 13:39:58.173171 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-index-x7vb7"
Mar 09 13:39:58 crc kubenswrapper[4703]: I0309 13:39:58.449890 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-index-x7vb7"]
Mar 09 13:39:59 crc kubenswrapper[4703]: I0309 13:39:59.126090 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-index-x7vb7" event={"ID":"2ec1ce09-e145-4077-90a6-39e531280898","Type":"ContainerStarted","Data":"439672abbd2fcda97f79a6ba25d71035bb40b06de3421df530b10dd6bc7ebcd8"}
Mar 09 13:40:00 crc kubenswrapper[4703]: I0309 13:40:00.132161 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551060-27x8b"]
Mar 09 13:40:00 crc kubenswrapper[4703]: I0309 13:40:00.133497 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551060-27x8b"
Mar 09 13:40:00 crc kubenswrapper[4703]: I0309 13:40:00.139335 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rkrjn"
Mar 09 13:40:00 crc kubenswrapper[4703]: I0309 13:40:00.139532 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:40:00 crc kubenswrapper[4703]: I0309 13:40:00.142282 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:40:00 crc kubenswrapper[4703]: I0309 13:40:00.146715 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551060-27x8b"]
Mar 09 13:40:00 crc kubenswrapper[4703]: I0309 13:40:00.257740 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2jtm\" (UniqueName: \"kubernetes.io/projected/86115e2b-93f7-439b-99e8-59baa25f2491-kube-api-access-z2jtm\") pod \"auto-csr-approver-29551060-27x8b\" (UID: \"86115e2b-93f7-439b-99e8-59baa25f2491\") " pod="openshift-infra/auto-csr-approver-29551060-27x8b"
Mar 09 13:40:00 crc kubenswrapper[4703]: I0309 13:40:00.359296 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2jtm\" (UniqueName: \"kubernetes.io/projected/86115e2b-93f7-439b-99e8-59baa25f2491-kube-api-access-z2jtm\") pod \"auto-csr-approver-29551060-27x8b\" (UID: \"86115e2b-93f7-439b-99e8-59baa25f2491\") " pod="openshift-infra/auto-csr-approver-29551060-27x8b"
Mar 09 13:40:00 crc kubenswrapper[4703]: I0309 13:40:00.382238 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2jtm\" (UniqueName: \"kubernetes.io/projected/86115e2b-93f7-439b-99e8-59baa25f2491-kube-api-access-z2jtm\") pod \"auto-csr-approver-29551060-27x8b\" (UID: \"86115e2b-93f7-439b-99e8-59baa25f2491\") " pod="openshift-infra/auto-csr-approver-29551060-27x8b"
Mar 09 13:40:00 crc kubenswrapper[4703]: I0309 13:40:00.467221 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551060-27x8b"
Mar 09 13:40:01 crc kubenswrapper[4703]: I0309 13:40:01.164944 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-index-x7vb7" event={"ID":"2ec1ce09-e145-4077-90a6-39e531280898","Type":"ContainerStarted","Data":"07056f36f4314355aeb4e9d338f5a9ae7fb9bfdf566107911c307593f98dcc0f"}
Mar 09 13:40:01 crc kubenswrapper[4703]: I0309 13:40:01.184885 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-index-x7vb7" podStartSLOduration=1.813923631 podStartE2EDuration="4.18485378s" podCreationTimestamp="2026-03-09 13:39:57 +0000 UTC" firstStartedPulling="2026-03-09 13:39:58.465585659 +0000 UTC m=+1194.433001355" lastFinishedPulling="2026-03-09 13:40:00.836515818 +0000 UTC m=+1196.803931504" observedRunningTime="2026-03-09 13:40:01.17776874 +0000 UTC m=+1197.145184426" watchObservedRunningTime="2026-03-09 13:40:01.18485378 +0000 UTC m=+1197.152269466"
Mar 09 13:40:01 crc kubenswrapper[4703]: I0309 13:40:01.250283 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551060-27x8b"]
Mar 09 13:40:01 crc kubenswrapper[4703]: W0309 13:40:01.257031 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86115e2b_93f7_439b_99e8_59baa25f2491.slice/crio-4a4a02201f70c6d5744abee6aaa44652f4d2469e36c0a62a331bbd761c259c4b WatchSource:0}: Error finding container 4a4a02201f70c6d5744abee6aaa44652f4d2469e36c0a62a331bbd761c259c4b: Status 404 returned error can't find the container with id 4a4a02201f70c6d5744abee6aaa44652f4d2469e36c0a62a331bbd761c259c4b
Mar 09 13:40:01 crc kubenswrapper[4703]: I0309 13:40:01.731038 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5"
Mar 09 13:40:02 crc kubenswrapper[4703]: I0309 13:40:02.042154 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/manila-operator-index-x7vb7"]
Mar 09 13:40:02 crc kubenswrapper[4703]: I0309 13:40:02.178114 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551060-27x8b" event={"ID":"86115e2b-93f7-439b-99e8-59baa25f2491","Type":"ContainerStarted","Data":"4a4a02201f70c6d5744abee6aaa44652f4d2469e36c0a62a331bbd761c259c4b"}
Mar 09 13:40:02 crc kubenswrapper[4703]: I0309 13:40:02.660693 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-index-nldr4"]
Mar 09 13:40:02 crc kubenswrapper[4703]: I0309 13:40:02.661823 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-index-nldr4"
Mar 09 13:40:02 crc kubenswrapper[4703]: I0309 13:40:02.685563 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-index-nldr4"]
Mar 09 13:40:02 crc kubenswrapper[4703]: I0309 13:40:02.691133 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r2kv\" (UniqueName: \"kubernetes.io/projected/029a9f28-5caa-45af-872a-59de8aa8cdbe-kube-api-access-2r2kv\") pod \"manila-operator-index-nldr4\" (UID: \"029a9f28-5caa-45af-872a-59de8aa8cdbe\") " pod="openstack-operators/manila-operator-index-nldr4"
Mar 09 13:40:02 crc kubenswrapper[4703]: I0309 13:40:02.793299 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r2kv\" (UniqueName: \"kubernetes.io/projected/029a9f28-5caa-45af-872a-59de8aa8cdbe-kube-api-access-2r2kv\") pod \"manila-operator-index-nldr4\" (UID: \"029a9f28-5caa-45af-872a-59de8aa8cdbe\") " pod="openstack-operators/manila-operator-index-nldr4"
Mar 09 13:40:02 crc kubenswrapper[4703]: I0309 13:40:02.814085 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r2kv\" (UniqueName: \"kubernetes.io/projected/029a9f28-5caa-45af-872a-59de8aa8cdbe-kube-api-access-2r2kv\") pod \"manila-operator-index-nldr4\" (UID: \"029a9f28-5caa-45af-872a-59de8aa8cdbe\") " pod="openstack-operators/manila-operator-index-nldr4"
Mar 09 13:40:02 crc kubenswrapper[4703]: I0309 13:40:02.981285 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-index-nldr4"
Mar 09 13:40:03 crc kubenswrapper[4703]: I0309 13:40:03.187050 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/manila-operator-index-x7vb7" podUID="2ec1ce09-e145-4077-90a6-39e531280898" containerName="registry-server" containerID="cri-o://07056f36f4314355aeb4e9d338f5a9ae7fb9bfdf566107911c307593f98dcc0f" gracePeriod=2
Mar 09 13:40:03 crc kubenswrapper[4703]: I0309 13:40:03.273956 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-index-nldr4"]
Mar 09 13:40:03 crc kubenswrapper[4703]: W0309 13:40:03.274529 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod029a9f28_5caa_45af_872a_59de8aa8cdbe.slice/crio-faef0dd36467d07c3f6c9959d276eaa9c8990a234d7d334b88beef7444e1e97e WatchSource:0}: Error finding container faef0dd36467d07c3f6c9959d276eaa9c8990a234d7d334b88beef7444e1e97e: Status 404 returned error can't find the container with id faef0dd36467d07c3f6c9959d276eaa9c8990a234d7d334b88beef7444e1e97e
Mar 09 13:40:03 crc kubenswrapper[4703]: I0309 13:40:03.522325 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-index-x7vb7"
Mar 09 13:40:03 crc kubenswrapper[4703]: I0309 13:40:03.603248 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnstz\" (UniqueName: \"kubernetes.io/projected/2ec1ce09-e145-4077-90a6-39e531280898-kube-api-access-cnstz\") pod \"2ec1ce09-e145-4077-90a6-39e531280898\" (UID: \"2ec1ce09-e145-4077-90a6-39e531280898\") "
Mar 09 13:40:03 crc kubenswrapper[4703]: I0309 13:40:03.609902 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ec1ce09-e145-4077-90a6-39e531280898-kube-api-access-cnstz" (OuterVolumeSpecName: "kube-api-access-cnstz") pod "2ec1ce09-e145-4077-90a6-39e531280898" (UID: "2ec1ce09-e145-4077-90a6-39e531280898"). InnerVolumeSpecName "kube-api-access-cnstz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:40:03 crc kubenswrapper[4703]: I0309 13:40:03.705034 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnstz\" (UniqueName: \"kubernetes.io/projected/2ec1ce09-e145-4077-90a6-39e531280898-kube-api-access-cnstz\") on node \"crc\" DevicePath \"\""
Mar 09 13:40:04 crc kubenswrapper[4703]: I0309 13:40:04.195913 4703 generic.go:334] "Generic (PLEG): container finished" podID="2ec1ce09-e145-4077-90a6-39e531280898" containerID="07056f36f4314355aeb4e9d338f5a9ae7fb9bfdf566107911c307593f98dcc0f" exitCode=0
Mar 09 13:40:04 crc kubenswrapper[4703]: I0309 13:40:04.195992 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-index-x7vb7" event={"ID":"2ec1ce09-e145-4077-90a6-39e531280898","Type":"ContainerDied","Data":"07056f36f4314355aeb4e9d338f5a9ae7fb9bfdf566107911c307593f98dcc0f"}
Mar 09 13:40:04 crc kubenswrapper[4703]: I0309 13:40:04.196023 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-index-x7vb7"
Mar 09 13:40:04 crc kubenswrapper[4703]: I0309 13:40:04.196059 4703 scope.go:117] "RemoveContainer" containerID="07056f36f4314355aeb4e9d338f5a9ae7fb9bfdf566107911c307593f98dcc0f"
Mar 09 13:40:04 crc kubenswrapper[4703]: I0309 13:40:04.196045 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-index-x7vb7" event={"ID":"2ec1ce09-e145-4077-90a6-39e531280898","Type":"ContainerDied","Data":"439672abbd2fcda97f79a6ba25d71035bb40b06de3421df530b10dd6bc7ebcd8"}
Mar 09 13:40:04 crc kubenswrapper[4703]: I0309 13:40:04.198317 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-index-nldr4" event={"ID":"029a9f28-5caa-45af-872a-59de8aa8cdbe","Type":"ContainerStarted","Data":"8231820edd08e43e7a8b081acd5f16398c424a1f1e8cc3f009892d03b5089780"}
Mar 09 13:40:04 crc kubenswrapper[4703]: I0309 13:40:04.198348 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-index-nldr4" event={"ID":"029a9f28-5caa-45af-872a-59de8aa8cdbe","Type":"ContainerStarted","Data":"faef0dd36467d07c3f6c9959d276eaa9c8990a234d7d334b88beef7444e1e97e"}
Mar 09 13:40:04 crc kubenswrapper[4703]: I0309 13:40:04.200979 4703 generic.go:334] "Generic (PLEG): container finished" podID="86115e2b-93f7-439b-99e8-59baa25f2491" containerID="46be58143c5655ade22fd8aac3f086bdb5cb8d1805cb0bb507632272e93f947a" exitCode=0
Mar 09 13:40:04 crc kubenswrapper[4703]: I0309 13:40:04.201026 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551060-27x8b" event={"ID":"86115e2b-93f7-439b-99e8-59baa25f2491","Type":"ContainerDied","Data":"46be58143c5655ade22fd8aac3f086bdb5cb8d1805cb0bb507632272e93f947a"}
Mar 09 13:40:04 crc kubenswrapper[4703]: I0309 13:40:04.232723 4703 scope.go:117] "RemoveContainer" containerID="07056f36f4314355aeb4e9d338f5a9ae7fb9bfdf566107911c307593f98dcc0f"
Mar 09 13:40:04 crc kubenswrapper[4703]: E0309 13:40:04.233490 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07056f36f4314355aeb4e9d338f5a9ae7fb9bfdf566107911c307593f98dcc0f\": container with ID starting with 07056f36f4314355aeb4e9d338f5a9ae7fb9bfdf566107911c307593f98dcc0f not found: ID does not exist" containerID="07056f36f4314355aeb4e9d338f5a9ae7fb9bfdf566107911c307593f98dcc0f"
Mar 09 13:40:04 crc kubenswrapper[4703]: I0309 13:40:04.233578 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07056f36f4314355aeb4e9d338f5a9ae7fb9bfdf566107911c307593f98dcc0f"} err="failed to get container status \"07056f36f4314355aeb4e9d338f5a9ae7fb9bfdf566107911c307593f98dcc0f\": rpc error: code = NotFound desc = could not find container \"07056f36f4314355aeb4e9d338f5a9ae7fb9bfdf566107911c307593f98dcc0f\": container with ID starting with 07056f36f4314355aeb4e9d338f5a9ae7fb9bfdf566107911c307593f98dcc0f not found: ID does not exist"
Mar 09 13:40:04 crc kubenswrapper[4703]: I0309 13:40:04.258775 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-index-nldr4" podStartSLOduration=2.216752107 podStartE2EDuration="2.25875257s" podCreationTimestamp="2026-03-09 13:40:02 +0000 UTC" firstStartedPulling="2026-03-09 13:40:03.277918115 +0000 UTC m=+1199.245333801" lastFinishedPulling="2026-03-09 13:40:03.319918578 +0000 UTC m=+1199.287334264" observedRunningTime="2026-03-09 13:40:04.247215055 +0000 UTC m=+1200.214630761" watchObservedRunningTime="2026-03-09 13:40:04.25875257 +0000 UTC m=+1200.226168276"
Mar 09 13:40:04 crc kubenswrapper[4703]: I0309 13:40:04.279615 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/manila-operator-index-x7vb7"]
Mar 09 13:40:04 crc kubenswrapper[4703]: I0309 13:40:04.288487 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/manila-operator-index-x7vb7"]
Mar 09 13:40:04 crc kubenswrapper[4703]: I0309 13:40:04.723587 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ec1ce09-e145-4077-90a6-39e531280898" path="/var/lib/kubelet/pods/2ec1ce09-e145-4077-90a6-39e531280898/volumes"
Mar 09 13:40:05 crc kubenswrapper[4703]: I0309 13:40:05.487623 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551060-27x8b"
Mar 09 13:40:05 crc kubenswrapper[4703]: I0309 13:40:05.532615 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2jtm\" (UniqueName: \"kubernetes.io/projected/86115e2b-93f7-439b-99e8-59baa25f2491-kube-api-access-z2jtm\") pod \"86115e2b-93f7-439b-99e8-59baa25f2491\" (UID: \"86115e2b-93f7-439b-99e8-59baa25f2491\") "
Mar 09 13:40:05 crc kubenswrapper[4703]: I0309 13:40:05.538207 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86115e2b-93f7-439b-99e8-59baa25f2491-kube-api-access-z2jtm" (OuterVolumeSpecName: "kube-api-access-z2jtm") pod "86115e2b-93f7-439b-99e8-59baa25f2491" (UID: "86115e2b-93f7-439b-99e8-59baa25f2491"). InnerVolumeSpecName "kube-api-access-z2jtm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:40:05 crc kubenswrapper[4703]: I0309 13:40:05.635130 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2jtm\" (UniqueName: \"kubernetes.io/projected/86115e2b-93f7-439b-99e8-59baa25f2491-kube-api-access-z2jtm\") on node \"crc\" DevicePath \"\""
Mar 09 13:40:06 crc kubenswrapper[4703]: I0309 13:40:06.213162 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551060-27x8b" event={"ID":"86115e2b-93f7-439b-99e8-59baa25f2491","Type":"ContainerDied","Data":"4a4a02201f70c6d5744abee6aaa44652f4d2469e36c0a62a331bbd761c259c4b"}
Mar 09 13:40:06 crc kubenswrapper[4703]: I0309 13:40:06.213405 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a4a02201f70c6d5744abee6aaa44652f4d2469e36c0a62a331bbd761c259c4b"
Mar 09 13:40:06 crc kubenswrapper[4703]: I0309 13:40:06.213404 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551060-27x8b"
Mar 09 13:40:06 crc kubenswrapper[4703]: I0309 13:40:06.537243 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551054-nt48w"]
Mar 09 13:40:06 crc kubenswrapper[4703]: I0309 13:40:06.542368 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551054-nt48w"]
Mar 09 13:40:06 crc kubenswrapper[4703]: I0309 13:40:06.715247 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb09a646-e855-46a3-8091-333d81ef7c8f" path="/var/lib/kubelet/pods/bb09a646-e855-46a3-8091-333d81ef7c8f/volumes"
Mar 09 13:40:09 crc kubenswrapper[4703]: I0309 13:40:09.499957 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:40:09 crc kubenswrapper[4703]: I0309 13:40:09.501617 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:40:09 crc kubenswrapper[4703]: I0309 13:40:09.501964 4703 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj"
Mar 09 13:40:09 crc kubenswrapper[4703]: I0309 13:40:09.502984 4703 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"10cd3ad358458f317f93eedfe061a2094205d9a36cfaa6732883cb518fdce4bc"} pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 13:40:09 crc kubenswrapper[4703]: I0309 13:40:09.503230 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" containerID="cri-o://10cd3ad358458f317f93eedfe061a2094205d9a36cfaa6732883cb518fdce4bc" gracePeriod=600
Mar 09 13:40:10 crc kubenswrapper[4703]: I0309 13:40:10.244462 4703 generic.go:334] "Generic (PLEG): container finished" podID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerID="10cd3ad358458f317f93eedfe061a2094205d9a36cfaa6732883cb518fdce4bc" exitCode=0
Mar 09 13:40:10 crc kubenswrapper[4703]: I0309 13:40:10.244551 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" event={"ID":"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29","Type":"ContainerDied","Data":"10cd3ad358458f317f93eedfe061a2094205d9a36cfaa6732883cb518fdce4bc"}
Mar 09 13:40:10 crc kubenswrapper[4703]: I0309 13:40:10.244921 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" event={"ID":"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29","Type":"ContainerStarted","Data":"63593ee12ca38b373f82a991b81f46458f13b7e7d0525d9d0305c5a08fc34572"}
Mar 09 13:40:10 crc kubenswrapper[4703]: I0309 13:40:10.244951 4703 scope.go:117] "RemoveContainer" containerID="5f24d1f6f12dd9f2dbe4d447d5f4ec4a161023da53dbcd799df4861c45de4b0c"
Mar 09 13:40:12 crc kubenswrapper[4703]: I0309 13:40:12.982139 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-index-nldr4"
Mar 09 13:40:12 crc kubenswrapper[4703]: I0309 13:40:12.983072 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/manila-operator-index-nldr4"
Mar 09 13:40:13 crc kubenswrapper[4703]: I0309 13:40:13.013367 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/manila-operator-index-nldr4"
Mar 09 13:40:13 crc kubenswrapper[4703]: I0309 13:40:13.315146 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-index-nldr4"
Mar 09 13:40:25 crc kubenswrapper[4703]: I0309 13:40:25.287659 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b"]
Mar 09 13:40:25 crc kubenswrapper[4703]: E0309 13:40:25.288596 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ec1ce09-e145-4077-90a6-39e531280898" containerName="registry-server"
Mar 09 13:40:25 crc kubenswrapper[4703]: I0309 13:40:25.288614 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ec1ce09-e145-4077-90a6-39e531280898" containerName="registry-server"
Mar 09 13:40:25 crc kubenswrapper[4703]: E0309 13:40:25.288633 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86115e2b-93f7-439b-99e8-59baa25f2491" containerName="oc"
Mar 09 13:40:25 crc kubenswrapper[4703]: I0309 13:40:25.288640 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="86115e2b-93f7-439b-99e8-59baa25f2491" containerName="oc"
Mar 09 13:40:25 crc kubenswrapper[4703]: I0309 13:40:25.288799 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="86115e2b-93f7-439b-99e8-59baa25f2491" containerName="oc"
Mar 09 13:40:25 crc kubenswrapper[4703]: I0309 13:40:25.288816 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ec1ce09-e145-4077-90a6-39e531280898" containerName="registry-server"
Mar 09 13:40:25 crc kubenswrapper[4703]: I0309 13:40:25.289912 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b"
Mar 09 13:40:25 crc kubenswrapper[4703]: I0309 13:40:25.293361 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-cxl8l"
Mar 09 13:40:25 crc kubenswrapper[4703]: I0309 13:40:25.301395 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b"]
Mar 09 13:40:25 crc kubenswrapper[4703]: I0309 13:40:25.326954 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44c9d07a-5579-487a-8ef5-b86c849166ff-bundle\") pod \"23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b\" (UID: \"44c9d07a-5579-487a-8ef5-b86c849166ff\") " pod="openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b"
Mar 09 13:40:25 crc kubenswrapper[4703]: I0309 13:40:25.327051 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hqdg\" (UniqueName: \"kubernetes.io/projected/44c9d07a-5579-487a-8ef5-b86c849166ff-kube-api-access-2hqdg\") pod \"23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b\" (UID: \"44c9d07a-5579-487a-8ef5-b86c849166ff\") " pod="openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b"
Mar 09 13:40:25 crc kubenswrapper[4703]: I0309 13:40:25.327082 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44c9d07a-5579-487a-8ef5-b86c849166ff-util\") pod \"23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b\" (UID: \"44c9d07a-5579-487a-8ef5-b86c849166ff\") " pod="openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b"
Mar 09 13:40:25 crc kubenswrapper[4703]: I0309 13:40:25.427878 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44c9d07a-5579-487a-8ef5-b86c849166ff-util\") pod \"23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b\" (UID: \"44c9d07a-5579-487a-8ef5-b86c849166ff\") " pod="openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b"
Mar 09 13:40:25 crc kubenswrapper[4703]: I0309 13:40:25.427929 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hqdg\" (UniqueName: \"kubernetes.io/projected/44c9d07a-5579-487a-8ef5-b86c849166ff-kube-api-access-2hqdg\") pod \"23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b\" (UID: \"44c9d07a-5579-487a-8ef5-b86c849166ff\") " pod="openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b"
Mar 09 13:40:25 crc kubenswrapper[4703]: I0309 13:40:25.427996 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44c9d07a-5579-487a-8ef5-b86c849166ff-bundle\") pod \"23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b\" (UID: \"44c9d07a-5579-487a-8ef5-b86c849166ff\") " pod="openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b"
Mar 09 13:40:25 crc kubenswrapper[4703]: I0309 13:40:25.428436 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44c9d07a-5579-487a-8ef5-b86c849166ff-bundle\") pod \"23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b\" (UID: \"44c9d07a-5579-487a-8ef5-b86c849166ff\") " pod="openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b"
Mar 09 13:40:25 crc kubenswrapper[4703]: I0309 13:40:25.428890 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44c9d07a-5579-487a-8ef5-b86c849166ff-util\") pod \"23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b\" (UID: \"44c9d07a-5579-487a-8ef5-b86c849166ff\") " pod="openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b"
Mar 09 13:40:25 crc kubenswrapper[4703]: I0309 13:40:25.449722 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hqdg\" (UniqueName: \"kubernetes.io/projected/44c9d07a-5579-487a-8ef5-b86c849166ff-kube-api-access-2hqdg\") pod \"23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b\" (UID: \"44c9d07a-5579-487a-8ef5-b86c849166ff\") " pod="openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b"
Mar 09 13:40:25 crc kubenswrapper[4703]: I0309 13:40:25.608897 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b"
Mar 09 13:40:26 crc kubenswrapper[4703]: I0309 13:40:26.039410 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b"]
Mar 09 13:40:26 crc kubenswrapper[4703]: I0309 13:40:26.378585 4703 generic.go:334] "Generic (PLEG): container finished" podID="44c9d07a-5579-487a-8ef5-b86c849166ff" containerID="e82fc16aa929fec22073aa7b689502f0c09d7dd0a24661be28cb43d88ebc858d" exitCode=0
Mar 09 13:40:26 crc kubenswrapper[4703]: I0309 13:40:26.378643 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b" event={"ID":"44c9d07a-5579-487a-8ef5-b86c849166ff","Type":"ContainerDied","Data":"e82fc16aa929fec22073aa7b689502f0c09d7dd0a24661be28cb43d88ebc858d"}
Mar 09 13:40:26 crc kubenswrapper[4703]: I0309 13:40:26.378948 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b" event={"ID":"44c9d07a-5579-487a-8ef5-b86c849166ff","Type":"ContainerStarted","Data":"fdb8a1ddba5a6a58844e3f092bca5c4f9bff44b71d01b04a6c889998c3c63610"}
Mar 09 13:40:26 crc kubenswrapper[4703]: I0309 13:40:26.381220 4703 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 13:40:26 crc kubenswrapper[4703]: I0309 13:40:26.646126 4703 scope.go:117] "RemoveContainer" containerID="0f4e9edfbc9aad297ec87e9b5f4c229b40c2f094d58b5abe2a633c3836a42535"
Mar 09 13:40:27 crc kubenswrapper[4703]: I0309 13:40:27.394430 4703 generic.go:334] "Generic (PLEG): container finished" podID="44c9d07a-5579-487a-8ef5-b86c849166ff" containerID="2aa1b5408cefe64fe06380dbac60b17f9050ade6d25f89496f034660467007a8" exitCode=0
Mar 09 13:40:27 crc kubenswrapper[4703]: I0309 13:40:27.394529 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b" event={"ID":"44c9d07a-5579-487a-8ef5-b86c849166ff","Type":"ContainerDied","Data":"2aa1b5408cefe64fe06380dbac60b17f9050ade6d25f89496f034660467007a8"}
Mar 09 13:40:28 crc kubenswrapper[4703]: I0309 13:40:28.408766 4703 generic.go:334] "Generic (PLEG): container finished" podID="44c9d07a-5579-487a-8ef5-b86c849166ff" containerID="a787fc6c4a4228f73efd5b310736629312dbe98466b00f7e398cc6bfae44b308" exitCode=0
Mar 09 13:40:28 crc kubenswrapper[4703]: I0309 13:40:28.408832 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b" event={"ID":"44c9d07a-5579-487a-8ef5-b86c849166ff","Type":"ContainerDied","Data":"a787fc6c4a4228f73efd5b310736629312dbe98466b00f7e398cc6bfae44b308"}
Mar 09 13:40:29 crc kubenswrapper[4703]: I0309 13:40:29.747191 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b"
Mar 09 13:40:29 crc kubenswrapper[4703]: I0309 13:40:29.906381 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44c9d07a-5579-487a-8ef5-b86c849166ff-util\") pod \"44c9d07a-5579-487a-8ef5-b86c849166ff\" (UID: \"44c9d07a-5579-487a-8ef5-b86c849166ff\") "
Mar 09 13:40:29 crc kubenswrapper[4703]: I0309 13:40:29.906482 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44c9d07a-5579-487a-8ef5-b86c849166ff-bundle\") pod \"44c9d07a-5579-487a-8ef5-b86c849166ff\" (UID: \"44c9d07a-5579-487a-8ef5-b86c849166ff\") "
Mar 09 13:40:29 crc kubenswrapper[4703]: I0309 13:40:29.906706 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hqdg\" (UniqueName: \"kubernetes.io/projected/44c9d07a-5579-487a-8ef5-b86c849166ff-kube-api-access-2hqdg\") pod \"44c9d07a-5579-487a-8ef5-b86c849166ff\" (UID: \"44c9d07a-5579-487a-8ef5-b86c849166ff\") "
Mar 09 13:40:29 crc kubenswrapper[4703]: I0309 13:40:29.911626 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44c9d07a-5579-487a-8ef5-b86c849166ff-bundle" (OuterVolumeSpecName: "bundle") pod "44c9d07a-5579-487a-8ef5-b86c849166ff" (UID: "44c9d07a-5579-487a-8ef5-b86c849166ff"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:40:29 crc kubenswrapper[4703]: I0309 13:40:29.922978 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44c9d07a-5579-487a-8ef5-b86c849166ff-util" (OuterVolumeSpecName: "util") pod "44c9d07a-5579-487a-8ef5-b86c849166ff" (UID: "44c9d07a-5579-487a-8ef5-b86c849166ff"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:40:29 crc kubenswrapper[4703]: I0309 13:40:29.924272 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44c9d07a-5579-487a-8ef5-b86c849166ff-kube-api-access-2hqdg" (OuterVolumeSpecName: "kube-api-access-2hqdg") pod "44c9d07a-5579-487a-8ef5-b86c849166ff" (UID: "44c9d07a-5579-487a-8ef5-b86c849166ff"). InnerVolumeSpecName "kube-api-access-2hqdg".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:40:30 crc kubenswrapper[4703]: I0309 13:40:30.008235 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hqdg\" (UniqueName: \"kubernetes.io/projected/44c9d07a-5579-487a-8ef5-b86c849166ff-kube-api-access-2hqdg\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:30 crc kubenswrapper[4703]: I0309 13:40:30.008268 4703 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44c9d07a-5579-487a-8ef5-b86c849166ff-util\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:30 crc kubenswrapper[4703]: I0309 13:40:30.008280 4703 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44c9d07a-5579-487a-8ef5-b86c849166ff-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:40:30 crc kubenswrapper[4703]: I0309 13:40:30.429596 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b" event={"ID":"44c9d07a-5579-487a-8ef5-b86c849166ff","Type":"ContainerDied","Data":"fdb8a1ddba5a6a58844e3f092bca5c4f9bff44b71d01b04a6c889998c3c63610"} Mar 09 13:40:30 crc kubenswrapper[4703]: I0309 13:40:30.429666 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdb8a1ddba5a6a58844e3f092bca5c4f9bff44b71d01b04a6c889998c3c63610" Mar 09 13:40:30 crc kubenswrapper[4703]: I0309 13:40:30.429757 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b" Mar 09 13:40:39 crc kubenswrapper[4703]: I0309 13:40:39.442349 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf"] Mar 09 13:40:39 crc kubenswrapper[4703]: E0309 13:40:39.444011 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c9d07a-5579-487a-8ef5-b86c849166ff" containerName="extract" Mar 09 13:40:39 crc kubenswrapper[4703]: I0309 13:40:39.444099 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c9d07a-5579-487a-8ef5-b86c849166ff" containerName="extract" Mar 09 13:40:39 crc kubenswrapper[4703]: E0309 13:40:39.444180 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c9d07a-5579-487a-8ef5-b86c849166ff" containerName="util" Mar 09 13:40:39 crc kubenswrapper[4703]: I0309 13:40:39.444242 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c9d07a-5579-487a-8ef5-b86c849166ff" containerName="util" Mar 09 13:40:39 crc kubenswrapper[4703]: E0309 13:40:39.444299 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c9d07a-5579-487a-8ef5-b86c849166ff" containerName="pull" Mar 09 13:40:39 crc kubenswrapper[4703]: I0309 13:40:39.444354 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c9d07a-5579-487a-8ef5-b86c849166ff" containerName="pull" Mar 09 13:40:39 crc kubenswrapper[4703]: I0309 13:40:39.444531 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="44c9d07a-5579-487a-8ef5-b86c849166ff" containerName="extract" Mar 09 13:40:39 crc kubenswrapper[4703]: I0309 13:40:39.445133 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf" Mar 09 13:40:39 crc kubenswrapper[4703]: I0309 13:40:39.447401 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-tvr66" Mar 09 13:40:39 crc kubenswrapper[4703]: I0309 13:40:39.447503 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-service-cert" Mar 09 13:40:39 crc kubenswrapper[4703]: I0309 13:40:39.450115 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b93aff6-9e4d-446c-9882-b572d92049db-apiservice-cert\") pod \"manila-operator-controller-manager-7b694485dc-62gvf\" (UID: \"9b93aff6-9e4d-446c-9882-b572d92049db\") " pod="openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf" Mar 09 13:40:39 crc kubenswrapper[4703]: I0309 13:40:39.450198 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5kxx\" (UniqueName: \"kubernetes.io/projected/9b93aff6-9e4d-446c-9882-b572d92049db-kube-api-access-z5kxx\") pod \"manila-operator-controller-manager-7b694485dc-62gvf\" (UID: \"9b93aff6-9e4d-446c-9882-b572d92049db\") " pod="openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf" Mar 09 13:40:39 crc kubenswrapper[4703]: I0309 13:40:39.450240 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b93aff6-9e4d-446c-9882-b572d92049db-webhook-cert\") pod \"manila-operator-controller-manager-7b694485dc-62gvf\" (UID: \"9b93aff6-9e4d-446c-9882-b572d92049db\") " pod="openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf" Mar 09 13:40:39 crc kubenswrapper[4703]: I0309 13:40:39.455026 4703 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf"] Mar 09 13:40:39 crc kubenswrapper[4703]: I0309 13:40:39.551599 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5kxx\" (UniqueName: \"kubernetes.io/projected/9b93aff6-9e4d-446c-9882-b572d92049db-kube-api-access-z5kxx\") pod \"manila-operator-controller-manager-7b694485dc-62gvf\" (UID: \"9b93aff6-9e4d-446c-9882-b572d92049db\") " pod="openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf" Mar 09 13:40:39 crc kubenswrapper[4703]: I0309 13:40:39.551990 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b93aff6-9e4d-446c-9882-b572d92049db-webhook-cert\") pod \"manila-operator-controller-manager-7b694485dc-62gvf\" (UID: \"9b93aff6-9e4d-446c-9882-b572d92049db\") " pod="openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf" Mar 09 13:40:39 crc kubenswrapper[4703]: I0309 13:40:39.552164 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b93aff6-9e4d-446c-9882-b572d92049db-apiservice-cert\") pod \"manila-operator-controller-manager-7b694485dc-62gvf\" (UID: \"9b93aff6-9e4d-446c-9882-b572d92049db\") " pod="openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf" Mar 09 13:40:39 crc kubenswrapper[4703]: I0309 13:40:39.558373 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b93aff6-9e4d-446c-9882-b572d92049db-apiservice-cert\") pod \"manila-operator-controller-manager-7b694485dc-62gvf\" (UID: \"9b93aff6-9e4d-446c-9882-b572d92049db\") " pod="openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf" Mar 09 13:40:39 crc kubenswrapper[4703]: I0309 13:40:39.558930 4703 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b93aff6-9e4d-446c-9882-b572d92049db-webhook-cert\") pod \"manila-operator-controller-manager-7b694485dc-62gvf\" (UID: \"9b93aff6-9e4d-446c-9882-b572d92049db\") " pod="openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf" Mar 09 13:40:39 crc kubenswrapper[4703]: I0309 13:40:39.573100 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5kxx\" (UniqueName: \"kubernetes.io/projected/9b93aff6-9e4d-446c-9882-b572d92049db-kube-api-access-z5kxx\") pod \"manila-operator-controller-manager-7b694485dc-62gvf\" (UID: \"9b93aff6-9e4d-446c-9882-b572d92049db\") " pod="openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf" Mar 09 13:40:39 crc kubenswrapper[4703]: I0309 13:40:39.773393 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf" Mar 09 13:40:40 crc kubenswrapper[4703]: I0309 13:40:40.238989 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf"] Mar 09 13:40:40 crc kubenswrapper[4703]: W0309 13:40:40.249949 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b93aff6_9e4d_446c_9882_b572d92049db.slice/crio-68e06b1824eb1382eeae01eb25ebdf1f1cf574506a631784145f7bb6418bb8c2 WatchSource:0}: Error finding container 68e06b1824eb1382eeae01eb25ebdf1f1cf574506a631784145f7bb6418bb8c2: Status 404 returned error can't find the container with id 68e06b1824eb1382eeae01eb25ebdf1f1cf574506a631784145f7bb6418bb8c2 Mar 09 13:40:40 crc kubenswrapper[4703]: I0309 13:40:40.518517 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf" 
event={"ID":"9b93aff6-9e4d-446c-9882-b572d92049db","Type":"ContainerStarted","Data":"68e06b1824eb1382eeae01eb25ebdf1f1cf574506a631784145f7bb6418bb8c2"} Mar 09 13:40:42 crc kubenswrapper[4703]: I0309 13:40:42.533508 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf" event={"ID":"9b93aff6-9e4d-446c-9882-b572d92049db","Type":"ContainerStarted","Data":"082ebc0ed6faefa2e0d09e8c2a5d615882f29741f9d792d2981831bff9706549"} Mar 09 13:40:42 crc kubenswrapper[4703]: I0309 13:40:42.534168 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf" Mar 09 13:40:42 crc kubenswrapper[4703]: I0309 13:40:42.553380 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf" podStartSLOduration=2.312402926 podStartE2EDuration="3.55336522s" podCreationTimestamp="2026-03-09 13:40:39 +0000 UTC" firstStartedPulling="2026-03-09 13:40:40.256475884 +0000 UTC m=+1236.223891610" lastFinishedPulling="2026-03-09 13:40:41.497438208 +0000 UTC m=+1237.464853904" observedRunningTime="2026-03-09 13:40:42.551878828 +0000 UTC m=+1238.519294514" watchObservedRunningTime="2026-03-09 13:40:42.55336522 +0000 UTC m=+1238.520780906" Mar 09 13:40:49 crc kubenswrapper[4703]: I0309 13:40:49.778184 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf" Mar 09 13:40:57 crc kubenswrapper[4703]: I0309 13:40:57.735520 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-db-create-d9tx7"] Mar 09 13:40:57 crc kubenswrapper[4703]: I0309 13:40:57.737063 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-db-create-d9tx7" Mar 09 13:40:57 crc kubenswrapper[4703]: I0309 13:40:57.741135 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-db-create-d9tx7"] Mar 09 13:40:57 crc kubenswrapper[4703]: I0309 13:40:57.826226 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23a35046-9fb7-49ec-86d9-1fdc270b2cb7-operator-scripts\") pod \"manila-db-create-d9tx7\" (UID: \"23a35046-9fb7-49ec-86d9-1fdc270b2cb7\") " pod="manila-kuttl-tests/manila-db-create-d9tx7" Mar 09 13:40:57 crc kubenswrapper[4703]: I0309 13:40:57.826556 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nv6n\" (UniqueName: \"kubernetes.io/projected/23a35046-9fb7-49ec-86d9-1fdc270b2cb7-kube-api-access-2nv6n\") pod \"manila-db-create-d9tx7\" (UID: \"23a35046-9fb7-49ec-86d9-1fdc270b2cb7\") " pod="manila-kuttl-tests/manila-db-create-d9tx7" Mar 09 13:40:57 crc kubenswrapper[4703]: I0309 13:40:57.831344 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-829d-account-create-update-zftrv"] Mar 09 13:40:57 crc kubenswrapper[4703]: I0309 13:40:57.832159 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-829d-account-create-update-zftrv" Mar 09 13:40:57 crc kubenswrapper[4703]: I0309 13:40:57.833984 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-db-secret" Mar 09 13:40:57 crc kubenswrapper[4703]: I0309 13:40:57.843822 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-829d-account-create-update-zftrv"] Mar 09 13:40:57 crc kubenswrapper[4703]: I0309 13:40:57.927541 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23a35046-9fb7-49ec-86d9-1fdc270b2cb7-operator-scripts\") pod \"manila-db-create-d9tx7\" (UID: \"23a35046-9fb7-49ec-86d9-1fdc270b2cb7\") " pod="manila-kuttl-tests/manila-db-create-d9tx7" Mar 09 13:40:57 crc kubenswrapper[4703]: I0309 13:40:57.927593 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk8jm\" (UniqueName: \"kubernetes.io/projected/7ec56df1-32a2-428a-98a3-4e824aaca5be-kube-api-access-kk8jm\") pod \"manila-829d-account-create-update-zftrv\" (UID: \"7ec56df1-32a2-428a-98a3-4e824aaca5be\") " pod="manila-kuttl-tests/manila-829d-account-create-update-zftrv" Mar 09 13:40:57 crc kubenswrapper[4703]: I0309 13:40:57.927639 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ec56df1-32a2-428a-98a3-4e824aaca5be-operator-scripts\") pod \"manila-829d-account-create-update-zftrv\" (UID: \"7ec56df1-32a2-428a-98a3-4e824aaca5be\") " pod="manila-kuttl-tests/manila-829d-account-create-update-zftrv" Mar 09 13:40:57 crc kubenswrapper[4703]: I0309 13:40:57.927715 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nv6n\" (UniqueName: 
\"kubernetes.io/projected/23a35046-9fb7-49ec-86d9-1fdc270b2cb7-kube-api-access-2nv6n\") pod \"manila-db-create-d9tx7\" (UID: \"23a35046-9fb7-49ec-86d9-1fdc270b2cb7\") " pod="manila-kuttl-tests/manila-db-create-d9tx7" Mar 09 13:40:57 crc kubenswrapper[4703]: I0309 13:40:57.928448 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23a35046-9fb7-49ec-86d9-1fdc270b2cb7-operator-scripts\") pod \"manila-db-create-d9tx7\" (UID: \"23a35046-9fb7-49ec-86d9-1fdc270b2cb7\") " pod="manila-kuttl-tests/manila-db-create-d9tx7" Mar 09 13:40:57 crc kubenswrapper[4703]: I0309 13:40:57.945885 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nv6n\" (UniqueName: \"kubernetes.io/projected/23a35046-9fb7-49ec-86d9-1fdc270b2cb7-kube-api-access-2nv6n\") pod \"manila-db-create-d9tx7\" (UID: \"23a35046-9fb7-49ec-86d9-1fdc270b2cb7\") " pod="manila-kuttl-tests/manila-db-create-d9tx7" Mar 09 13:40:58 crc kubenswrapper[4703]: I0309 13:40:58.029319 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk8jm\" (UniqueName: \"kubernetes.io/projected/7ec56df1-32a2-428a-98a3-4e824aaca5be-kube-api-access-kk8jm\") pod \"manila-829d-account-create-update-zftrv\" (UID: \"7ec56df1-32a2-428a-98a3-4e824aaca5be\") " pod="manila-kuttl-tests/manila-829d-account-create-update-zftrv" Mar 09 13:40:58 crc kubenswrapper[4703]: I0309 13:40:58.029414 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ec56df1-32a2-428a-98a3-4e824aaca5be-operator-scripts\") pod \"manila-829d-account-create-update-zftrv\" (UID: \"7ec56df1-32a2-428a-98a3-4e824aaca5be\") " pod="manila-kuttl-tests/manila-829d-account-create-update-zftrv" Mar 09 13:40:58 crc kubenswrapper[4703]: I0309 13:40:58.030695 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ec56df1-32a2-428a-98a3-4e824aaca5be-operator-scripts\") pod \"manila-829d-account-create-update-zftrv\" (UID: \"7ec56df1-32a2-428a-98a3-4e824aaca5be\") " pod="manila-kuttl-tests/manila-829d-account-create-update-zftrv" Mar 09 13:40:58 crc kubenswrapper[4703]: I0309 13:40:58.047850 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk8jm\" (UniqueName: \"kubernetes.io/projected/7ec56df1-32a2-428a-98a3-4e824aaca5be-kube-api-access-kk8jm\") pod \"manila-829d-account-create-update-zftrv\" (UID: \"7ec56df1-32a2-428a-98a3-4e824aaca5be\") " pod="manila-kuttl-tests/manila-829d-account-create-update-zftrv" Mar 09 13:40:58 crc kubenswrapper[4703]: I0309 13:40:58.064630 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-db-create-d9tx7" Mar 09 13:40:58 crc kubenswrapper[4703]: I0309 13:40:58.151838 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-829d-account-create-update-zftrv" Mar 09 13:40:58 crc kubenswrapper[4703]: I0309 13:40:58.498598 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-db-create-d9tx7"] Mar 09 13:40:58 crc kubenswrapper[4703]: I0309 13:40:58.547544 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-829d-account-create-update-zftrv"] Mar 09 13:40:58 crc kubenswrapper[4703]: W0309 13:40:58.548819 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ec56df1_32a2_428a_98a3_4e824aaca5be.slice/crio-d7986c5dbc12b53de8346134c2dc6b245a97311e3f9bd222a696d9431227960a WatchSource:0}: Error finding container d7986c5dbc12b53de8346134c2dc6b245a97311e3f9bd222a696d9431227960a: Status 404 returned error can't find the container with id d7986c5dbc12b53de8346134c2dc6b245a97311e3f9bd222a696d9431227960a Mar 09 13:40:58 crc kubenswrapper[4703]: I0309 13:40:58.654192 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-829d-account-create-update-zftrv" event={"ID":"7ec56df1-32a2-428a-98a3-4e824aaca5be","Type":"ContainerStarted","Data":"fa8ff7679c48da18ad72979fd0bd9429bb61cf1f55c33bb431306d56f86b0804"} Mar 09 13:40:58 crc kubenswrapper[4703]: I0309 13:40:58.654442 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-829d-account-create-update-zftrv" event={"ID":"7ec56df1-32a2-428a-98a3-4e824aaca5be","Type":"ContainerStarted","Data":"d7986c5dbc12b53de8346134c2dc6b245a97311e3f9bd222a696d9431227960a"} Mar 09 13:40:58 crc kubenswrapper[4703]: I0309 13:40:58.656392 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-create-d9tx7" event={"ID":"23a35046-9fb7-49ec-86d9-1fdc270b2cb7","Type":"ContainerStarted","Data":"6d189dd2bdf061adcbabd7355f537a98c02269eeafc420de3f83bacbf27fd7b9"} Mar 09 13:40:58 crc 
kubenswrapper[4703]: I0309 13:40:58.656438 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-create-d9tx7" event={"ID":"23a35046-9fb7-49ec-86d9-1fdc270b2cb7","Type":"ContainerStarted","Data":"3e606046fe039c5a11baa4da36012155d0490e636de8c125a48d2ec99f1ab198"} Mar 09 13:40:58 crc kubenswrapper[4703]: I0309 13:40:58.676812 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-829d-account-create-update-zftrv" podStartSLOduration=1.676790287 podStartE2EDuration="1.676790287s" podCreationTimestamp="2026-03-09 13:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:58.670192342 +0000 UTC m=+1254.637608028" watchObservedRunningTime="2026-03-09 13:40:58.676790287 +0000 UTC m=+1254.644205973" Mar 09 13:40:58 crc kubenswrapper[4703]: I0309 13:40:58.683225 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-db-create-d9tx7" podStartSLOduration=1.683206478 podStartE2EDuration="1.683206478s" podCreationTimestamp="2026-03-09 13:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:40:58.682527389 +0000 UTC m=+1254.649943075" watchObservedRunningTime="2026-03-09 13:40:58.683206478 +0000 UTC m=+1254.650622174" Mar 09 13:40:59 crc kubenswrapper[4703]: I0309 13:40:59.663969 4703 generic.go:334] "Generic (PLEG): container finished" podID="7ec56df1-32a2-428a-98a3-4e824aaca5be" containerID="fa8ff7679c48da18ad72979fd0bd9429bb61cf1f55c33bb431306d56f86b0804" exitCode=0 Mar 09 13:40:59 crc kubenswrapper[4703]: I0309 13:40:59.664048 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-829d-account-create-update-zftrv" 
event={"ID":"7ec56df1-32a2-428a-98a3-4e824aaca5be","Type":"ContainerDied","Data":"fa8ff7679c48da18ad72979fd0bd9429bb61cf1f55c33bb431306d56f86b0804"} Mar 09 13:40:59 crc kubenswrapper[4703]: I0309 13:40:59.666473 4703 generic.go:334] "Generic (PLEG): container finished" podID="23a35046-9fb7-49ec-86d9-1fdc270b2cb7" containerID="6d189dd2bdf061adcbabd7355f537a98c02269eeafc420de3f83bacbf27fd7b9" exitCode=0 Mar 09 13:40:59 crc kubenswrapper[4703]: I0309 13:40:59.666533 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-create-d9tx7" event={"ID":"23a35046-9fb7-49ec-86d9-1fdc270b2cb7","Type":"ContainerDied","Data":"6d189dd2bdf061adcbabd7355f537a98c02269eeafc420de3f83bacbf27fd7b9"} Mar 09 13:41:01 crc kubenswrapper[4703]: I0309 13:41:01.025695 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-db-create-d9tx7" Mar 09 13:41:01 crc kubenswrapper[4703]: I0309 13:41:01.035777 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-829d-account-create-update-zftrv" Mar 09 13:41:01 crc kubenswrapper[4703]: I0309 13:41:01.079979 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk8jm\" (UniqueName: \"kubernetes.io/projected/7ec56df1-32a2-428a-98a3-4e824aaca5be-kube-api-access-kk8jm\") pod \"7ec56df1-32a2-428a-98a3-4e824aaca5be\" (UID: \"7ec56df1-32a2-428a-98a3-4e824aaca5be\") " Mar 09 13:41:01 crc kubenswrapper[4703]: I0309 13:41:01.080020 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23a35046-9fb7-49ec-86d9-1fdc270b2cb7-operator-scripts\") pod \"23a35046-9fb7-49ec-86d9-1fdc270b2cb7\" (UID: \"23a35046-9fb7-49ec-86d9-1fdc270b2cb7\") " Mar 09 13:41:01 crc kubenswrapper[4703]: I0309 13:41:01.080184 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ec56df1-32a2-428a-98a3-4e824aaca5be-operator-scripts\") pod \"7ec56df1-32a2-428a-98a3-4e824aaca5be\" (UID: \"7ec56df1-32a2-428a-98a3-4e824aaca5be\") " Mar 09 13:41:01 crc kubenswrapper[4703]: I0309 13:41:01.080203 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nv6n\" (UniqueName: \"kubernetes.io/projected/23a35046-9fb7-49ec-86d9-1fdc270b2cb7-kube-api-access-2nv6n\") pod \"23a35046-9fb7-49ec-86d9-1fdc270b2cb7\" (UID: \"23a35046-9fb7-49ec-86d9-1fdc270b2cb7\") " Mar 09 13:41:01 crc kubenswrapper[4703]: I0309 13:41:01.081010 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ec56df1-32a2-428a-98a3-4e824aaca5be-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ec56df1-32a2-428a-98a3-4e824aaca5be" (UID: "7ec56df1-32a2-428a-98a3-4e824aaca5be"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:01 crc kubenswrapper[4703]: I0309 13:41:01.081067 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23a35046-9fb7-49ec-86d9-1fdc270b2cb7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "23a35046-9fb7-49ec-86d9-1fdc270b2cb7" (UID: "23a35046-9fb7-49ec-86d9-1fdc270b2cb7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:41:01 crc kubenswrapper[4703]: I0309 13:41:01.085216 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ec56df1-32a2-428a-98a3-4e824aaca5be-kube-api-access-kk8jm" (OuterVolumeSpecName: "kube-api-access-kk8jm") pod "7ec56df1-32a2-428a-98a3-4e824aaca5be" (UID: "7ec56df1-32a2-428a-98a3-4e824aaca5be"). InnerVolumeSpecName "kube-api-access-kk8jm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:01 crc kubenswrapper[4703]: I0309 13:41:01.085995 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23a35046-9fb7-49ec-86d9-1fdc270b2cb7-kube-api-access-2nv6n" (OuterVolumeSpecName: "kube-api-access-2nv6n") pod "23a35046-9fb7-49ec-86d9-1fdc270b2cb7" (UID: "23a35046-9fb7-49ec-86d9-1fdc270b2cb7"). InnerVolumeSpecName "kube-api-access-2nv6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:01 crc kubenswrapper[4703]: I0309 13:41:01.181509 4703 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ec56df1-32a2-428a-98a3-4e824aaca5be-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:01 crc kubenswrapper[4703]: I0309 13:41:01.181564 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nv6n\" (UniqueName: \"kubernetes.io/projected/23a35046-9fb7-49ec-86d9-1fdc270b2cb7-kube-api-access-2nv6n\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:01 crc kubenswrapper[4703]: I0309 13:41:01.181588 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk8jm\" (UniqueName: \"kubernetes.io/projected/7ec56df1-32a2-428a-98a3-4e824aaca5be-kube-api-access-kk8jm\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:01 crc kubenswrapper[4703]: I0309 13:41:01.181607 4703 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23a35046-9fb7-49ec-86d9-1fdc270b2cb7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:01 crc kubenswrapper[4703]: I0309 13:41:01.685767 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-create-d9tx7" event={"ID":"23a35046-9fb7-49ec-86d9-1fdc270b2cb7","Type":"ContainerDied","Data":"3e606046fe039c5a11baa4da36012155d0490e636de8c125a48d2ec99f1ab198"} Mar 09 13:41:01 crc kubenswrapper[4703]: I0309 13:41:01.685796 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-db-create-d9tx7" Mar 09 13:41:01 crc kubenswrapper[4703]: I0309 13:41:01.685814 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e606046fe039c5a11baa4da36012155d0490e636de8c125a48d2ec99f1ab198" Mar 09 13:41:01 crc kubenswrapper[4703]: I0309 13:41:01.701200 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-829d-account-create-update-zftrv" event={"ID":"7ec56df1-32a2-428a-98a3-4e824aaca5be","Type":"ContainerDied","Data":"d7986c5dbc12b53de8346134c2dc6b245a97311e3f9bd222a696d9431227960a"} Mar 09 13:41:01 crc kubenswrapper[4703]: I0309 13:41:01.701252 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7986c5dbc12b53de8346134c2dc6b245a97311e3f9bd222a696d9431227960a" Mar 09 13:41:01 crc kubenswrapper[4703]: I0309 13:41:01.701336 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-829d-account-create-update-zftrv" Mar 09 13:41:03 crc kubenswrapper[4703]: I0309 13:41:03.154212 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-db-sync-bvnrg"] Mar 09 13:41:03 crc kubenswrapper[4703]: E0309 13:41:03.155480 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec56df1-32a2-428a-98a3-4e824aaca5be" containerName="mariadb-account-create-update" Mar 09 13:41:03 crc kubenswrapper[4703]: I0309 13:41:03.155513 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec56df1-32a2-428a-98a3-4e824aaca5be" containerName="mariadb-account-create-update" Mar 09 13:41:03 crc kubenswrapper[4703]: E0309 13:41:03.155544 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a35046-9fb7-49ec-86d9-1fdc270b2cb7" containerName="mariadb-database-create" Mar 09 13:41:03 crc kubenswrapper[4703]: I0309 13:41:03.155557 4703 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="23a35046-9fb7-49ec-86d9-1fdc270b2cb7" containerName="mariadb-database-create" Mar 09 13:41:03 crc kubenswrapper[4703]: I0309 13:41:03.155958 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a35046-9fb7-49ec-86d9-1fdc270b2cb7" containerName="mariadb-database-create" Mar 09 13:41:03 crc kubenswrapper[4703]: I0309 13:41:03.155990 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ec56df1-32a2-428a-98a3-4e824aaca5be" containerName="mariadb-account-create-update" Mar 09 13:41:03 crc kubenswrapper[4703]: I0309 13:41:03.156749 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-db-sync-bvnrg" Mar 09 13:41:03 crc kubenswrapper[4703]: I0309 13:41:03.160194 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-manila-dockercfg-tjdx7" Mar 09 13:41:03 crc kubenswrapper[4703]: I0309 13:41:03.160736 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-config-data" Mar 09 13:41:03 crc kubenswrapper[4703]: I0309 13:41:03.170520 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-db-sync-bvnrg"] Mar 09 13:41:03 crc kubenswrapper[4703]: I0309 13:41:03.315549 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/042542ba-8fa8-4c28-b930-75e5b6c0b4ff-config-data\") pod \"manila-db-sync-bvnrg\" (UID: \"042542ba-8fa8-4c28-b930-75e5b6c0b4ff\") " pod="manila-kuttl-tests/manila-db-sync-bvnrg" Mar 09 13:41:03 crc kubenswrapper[4703]: I0309 13:41:03.315586 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/042542ba-8fa8-4c28-b930-75e5b6c0b4ff-job-config-data\") pod \"manila-db-sync-bvnrg\" (UID: \"042542ba-8fa8-4c28-b930-75e5b6c0b4ff\") " 
pod="manila-kuttl-tests/manila-db-sync-bvnrg" Mar 09 13:41:03 crc kubenswrapper[4703]: I0309 13:41:03.315667 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n4ss\" (UniqueName: \"kubernetes.io/projected/042542ba-8fa8-4c28-b930-75e5b6c0b4ff-kube-api-access-6n4ss\") pod \"manila-db-sync-bvnrg\" (UID: \"042542ba-8fa8-4c28-b930-75e5b6c0b4ff\") " pod="manila-kuttl-tests/manila-db-sync-bvnrg" Mar 09 13:41:03 crc kubenswrapper[4703]: I0309 13:41:03.417494 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n4ss\" (UniqueName: \"kubernetes.io/projected/042542ba-8fa8-4c28-b930-75e5b6c0b4ff-kube-api-access-6n4ss\") pod \"manila-db-sync-bvnrg\" (UID: \"042542ba-8fa8-4c28-b930-75e5b6c0b4ff\") " pod="manila-kuttl-tests/manila-db-sync-bvnrg" Mar 09 13:41:03 crc kubenswrapper[4703]: I0309 13:41:03.417586 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/042542ba-8fa8-4c28-b930-75e5b6c0b4ff-config-data\") pod \"manila-db-sync-bvnrg\" (UID: \"042542ba-8fa8-4c28-b930-75e5b6c0b4ff\") " pod="manila-kuttl-tests/manila-db-sync-bvnrg" Mar 09 13:41:03 crc kubenswrapper[4703]: I0309 13:41:03.417612 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/042542ba-8fa8-4c28-b930-75e5b6c0b4ff-job-config-data\") pod \"manila-db-sync-bvnrg\" (UID: \"042542ba-8fa8-4c28-b930-75e5b6c0b4ff\") " pod="manila-kuttl-tests/manila-db-sync-bvnrg" Mar 09 13:41:03 crc kubenswrapper[4703]: I0309 13:41:03.423924 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/042542ba-8fa8-4c28-b930-75e5b6c0b4ff-job-config-data\") pod \"manila-db-sync-bvnrg\" (UID: \"042542ba-8fa8-4c28-b930-75e5b6c0b4ff\") " pod="manila-kuttl-tests/manila-db-sync-bvnrg" Mar 09 13:41:03 
crc kubenswrapper[4703]: I0309 13:41:03.424378 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/042542ba-8fa8-4c28-b930-75e5b6c0b4ff-config-data\") pod \"manila-db-sync-bvnrg\" (UID: \"042542ba-8fa8-4c28-b930-75e5b6c0b4ff\") " pod="manila-kuttl-tests/manila-db-sync-bvnrg" Mar 09 13:41:03 crc kubenswrapper[4703]: I0309 13:41:03.439574 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n4ss\" (UniqueName: \"kubernetes.io/projected/042542ba-8fa8-4c28-b930-75e5b6c0b4ff-kube-api-access-6n4ss\") pod \"manila-db-sync-bvnrg\" (UID: \"042542ba-8fa8-4c28-b930-75e5b6c0b4ff\") " pod="manila-kuttl-tests/manila-db-sync-bvnrg" Mar 09 13:41:03 crc kubenswrapper[4703]: I0309 13:41:03.482571 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-db-sync-bvnrg" Mar 09 13:41:03 crc kubenswrapper[4703]: I0309 13:41:03.935556 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-db-sync-bvnrg"] Mar 09 13:41:04 crc kubenswrapper[4703]: I0309 13:41:04.722329 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-sync-bvnrg" event={"ID":"042542ba-8fa8-4c28-b930-75e5b6c0b4ff","Type":"ContainerStarted","Data":"9dce0e88c6826d994bdacb5bc7058ea058bbebafcc2f2c84fc3d4ae4a2714396"} Mar 09 13:41:08 crc kubenswrapper[4703]: I0309 13:41:08.751021 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-sync-bvnrg" event={"ID":"042542ba-8fa8-4c28-b930-75e5b6c0b4ff","Type":"ContainerStarted","Data":"b0ede2a6d6f11ad35470934f54c977630b49b1951fbf2371fbed23fe3d896274"} Mar 09 13:41:08 crc kubenswrapper[4703]: I0309 13:41:08.766618 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-db-sync-bvnrg" podStartSLOduration=1.7469252210000001 podStartE2EDuration="5.76660217s" 
podCreationTimestamp="2026-03-09 13:41:03 +0000 UTC" firstStartedPulling="2026-03-09 13:41:03.940615891 +0000 UTC m=+1259.908031577" lastFinishedPulling="2026-03-09 13:41:07.96029284 +0000 UTC m=+1263.927708526" observedRunningTime="2026-03-09 13:41:08.766511267 +0000 UTC m=+1264.733926963" watchObservedRunningTime="2026-03-09 13:41:08.76660217 +0000 UTC m=+1264.734017856" Mar 09 13:41:20 crc kubenswrapper[4703]: I0309 13:41:20.833198 4703 generic.go:334] "Generic (PLEG): container finished" podID="042542ba-8fa8-4c28-b930-75e5b6c0b4ff" containerID="b0ede2a6d6f11ad35470934f54c977630b49b1951fbf2371fbed23fe3d896274" exitCode=0 Mar 09 13:41:20 crc kubenswrapper[4703]: I0309 13:41:20.833367 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-sync-bvnrg" event={"ID":"042542ba-8fa8-4c28-b930-75e5b6c0b4ff","Type":"ContainerDied","Data":"b0ede2a6d6f11ad35470934f54c977630b49b1951fbf2371fbed23fe3d896274"} Mar 09 13:41:22 crc kubenswrapper[4703]: I0309 13:41:22.118092 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-db-sync-bvnrg" Mar 09 13:41:22 crc kubenswrapper[4703]: I0309 13:41:22.208048 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/042542ba-8fa8-4c28-b930-75e5b6c0b4ff-config-data\") pod \"042542ba-8fa8-4c28-b930-75e5b6c0b4ff\" (UID: \"042542ba-8fa8-4c28-b930-75e5b6c0b4ff\") " Mar 09 13:41:22 crc kubenswrapper[4703]: I0309 13:41:22.208115 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/042542ba-8fa8-4c28-b930-75e5b6c0b4ff-job-config-data\") pod \"042542ba-8fa8-4c28-b930-75e5b6c0b4ff\" (UID: \"042542ba-8fa8-4c28-b930-75e5b6c0b4ff\") " Mar 09 13:41:22 crc kubenswrapper[4703]: I0309 13:41:22.208226 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n4ss\" (UniqueName: \"kubernetes.io/projected/042542ba-8fa8-4c28-b930-75e5b6c0b4ff-kube-api-access-6n4ss\") pod \"042542ba-8fa8-4c28-b930-75e5b6c0b4ff\" (UID: \"042542ba-8fa8-4c28-b930-75e5b6c0b4ff\") " Mar 09 13:41:22 crc kubenswrapper[4703]: I0309 13:41:22.212681 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/042542ba-8fa8-4c28-b930-75e5b6c0b4ff-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "042542ba-8fa8-4c28-b930-75e5b6c0b4ff" (UID: "042542ba-8fa8-4c28-b930-75e5b6c0b4ff"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:22 crc kubenswrapper[4703]: I0309 13:41:22.212748 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/042542ba-8fa8-4c28-b930-75e5b6c0b4ff-kube-api-access-6n4ss" (OuterVolumeSpecName: "kube-api-access-6n4ss") pod "042542ba-8fa8-4c28-b930-75e5b6c0b4ff" (UID: "042542ba-8fa8-4c28-b930-75e5b6c0b4ff"). InnerVolumeSpecName "kube-api-access-6n4ss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:41:22 crc kubenswrapper[4703]: I0309 13:41:22.215095 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/042542ba-8fa8-4c28-b930-75e5b6c0b4ff-config-data" (OuterVolumeSpecName: "config-data") pod "042542ba-8fa8-4c28-b930-75e5b6c0b4ff" (UID: "042542ba-8fa8-4c28-b930-75e5b6c0b4ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:41:22 crc kubenswrapper[4703]: I0309 13:41:22.309706 4703 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/042542ba-8fa8-4c28-b930-75e5b6c0b4ff-job-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:22 crc kubenswrapper[4703]: I0309 13:41:22.309740 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n4ss\" (UniqueName: \"kubernetes.io/projected/042542ba-8fa8-4c28-b930-75e5b6c0b4ff-kube-api-access-6n4ss\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:22 crc kubenswrapper[4703]: I0309 13:41:22.309752 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/042542ba-8fa8-4c28-b930-75e5b6c0b4ff-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:41:22 crc kubenswrapper[4703]: I0309 13:41:22.851691 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-sync-bvnrg" event={"ID":"042542ba-8fa8-4c28-b930-75e5b6c0b4ff","Type":"ContainerDied","Data":"9dce0e88c6826d994bdacb5bc7058ea058bbebafcc2f2c84fc3d4ae4a2714396"} Mar 09 13:41:22 crc kubenswrapper[4703]: I0309 13:41:22.851736 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dce0e88c6826d994bdacb5bc7058ea058bbebafcc2f2c84fc3d4ae4a2714396" Mar 09 13:41:22 crc kubenswrapper[4703]: I0309 13:41:22.851772 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-db-sync-bvnrg" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.145024 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-scheduler-0"] Mar 09 13:41:23 crc kubenswrapper[4703]: E0309 13:41:23.145375 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="042542ba-8fa8-4c28-b930-75e5b6c0b4ff" containerName="manila-db-sync" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.145391 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="042542ba-8fa8-4c28-b930-75e5b6c0b4ff" containerName="manila-db-sync" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.145551 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="042542ba-8fa8-4c28-b930-75e5b6c0b4ff" containerName="manila-db-sync" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.146390 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.152354 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-share-share0-0"] Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.153306 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.156005 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"ceph-conf-files" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.156021 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-share-share0-config-data" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.157967 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-scheduler-0"] Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.159480 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-config-data" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.159699 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-scheduler-config-data" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.159859 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-manila-dockercfg-tjdx7" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.159881 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-scripts" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.164943 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-share-share0-0"] Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.220988 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mccnk\" (UniqueName: \"kubernetes.io/projected/978210ad-7b9d-4cfb-9248-5ee38ebbe489-kube-api-access-mccnk\") pod \"manila-scheduler-0\" (UID: \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.221034 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/978210ad-7b9d-4cfb-9248-5ee38ebbe489-scripts\") pod \"manila-scheduler-0\" (UID: \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.221060 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/978210ad-7b9d-4cfb-9248-5ee38ebbe489-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.221076 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/978210ad-7b9d-4cfb-9248-5ee38ebbe489-config-data\") pod \"manila-scheduler-0\" (UID: \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.221115 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/978210ad-7b9d-4cfb-9248-5ee38ebbe489-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.322854 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbvwd\" (UniqueName: \"kubernetes.io/projected/6e06cecc-5443-47b2-8936-85ac055ea57f-kube-api-access-bbvwd\") pod \"manila-share-share0-0\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.322966 4703 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mccnk\" (UniqueName: \"kubernetes.io/projected/978210ad-7b9d-4cfb-9248-5ee38ebbe489-kube-api-access-mccnk\") pod \"manila-scheduler-0\" (UID: \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.323052 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-scripts\") pod \"manila-share-share0-0\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.323069 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-ceph\") pod \"manila-share-share0-0\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.323086 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-config-data-custom\") pod \"manila-share-share0-0\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.323126 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/978210ad-7b9d-4cfb-9248-5ee38ebbe489-scripts\") pod \"manila-scheduler-0\" (UID: \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.323235 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/978210ad-7b9d-4cfb-9248-5ee38ebbe489-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.323299 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/978210ad-7b9d-4cfb-9248-5ee38ebbe489-config-data\") pod \"manila-scheduler-0\" (UID: \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.323373 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/6e06cecc-5443-47b2-8936-85ac055ea57f-var-lib-manila\") pod \"manila-share-share0-0\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.323464 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/978210ad-7b9d-4cfb-9248-5ee38ebbe489-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.323540 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/978210ad-7b9d-4cfb-9248-5ee38ebbe489-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.323600 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/6e06cecc-5443-47b2-8936-85ac055ea57f-etc-machine-id\") pod \"manila-share-share0-0\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.323647 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-config-data\") pod \"manila-share-share0-0\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.327839 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/978210ad-7b9d-4cfb-9248-5ee38ebbe489-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.328154 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/978210ad-7b9d-4cfb-9248-5ee38ebbe489-config-data\") pod \"manila-scheduler-0\" (UID: \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.328232 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/978210ad-7b9d-4cfb-9248-5ee38ebbe489-scripts\") pod \"manila-scheduler-0\" (UID: \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.339789 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mccnk\" (UniqueName: \"kubernetes.io/projected/978210ad-7b9d-4cfb-9248-5ee38ebbe489-kube-api-access-mccnk\") pod \"manila-scheduler-0\" (UID: 
\"978210ad-7b9d-4cfb-9248-5ee38ebbe489\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.399739 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-api-0"] Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.401105 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-api-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.402939 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-api-config-data" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.424491 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-ceph\") pod \"manila-share-share0-0\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.424540 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-config-data-custom\") pod \"manila-share-share0-0\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.424557 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-scripts\") pod \"manila-share-share0-0\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.424598 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/6e06cecc-5443-47b2-8936-85ac055ea57f-var-lib-manila\") pod \"manila-share-share0-0\" 
(UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.424649 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e06cecc-5443-47b2-8936-85ac055ea57f-etc-machine-id\") pod \"manila-share-share0-0\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.424662 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-config-data\") pod \"manila-share-share0-0\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.424690 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbvwd\" (UniqueName: \"kubernetes.io/projected/6e06cecc-5443-47b2-8936-85ac055ea57f-kube-api-access-bbvwd\") pod \"manila-share-share0-0\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.425017 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-api-0"] Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.425217 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/6e06cecc-5443-47b2-8936-85ac055ea57f-var-lib-manila\") pod \"manila-share-share0-0\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.425355 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/6e06cecc-5443-47b2-8936-85ac055ea57f-etc-machine-id\") pod \"manila-share-share0-0\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.432235 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-scripts\") pod \"manila-share-share0-0\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.432255 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-config-data\") pod \"manila-share-share0-0\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.433509 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-ceph\") pod \"manila-share-share0-0\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.445304 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-config-data-custom\") pod \"manila-share-share0-0\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.464523 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbvwd\" (UniqueName: \"kubernetes.io/projected/6e06cecc-5443-47b2-8936-85ac055ea57f-kube-api-access-bbvwd\") pod \"manila-share-share0-0\" (UID: 
\"6e06cecc-5443-47b2-8936-85ac055ea57f\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.475406 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.483719 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.526456 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-logs\") pod \"manila-api-0\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.526518 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56bzp\" (UniqueName: \"kubernetes.io/projected/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-kube-api-access-56bzp\") pod \"manila-api-0\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.526579 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-scripts\") pod \"manila-api-0\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.526612 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-config-data\") pod \"manila-api-0\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:41:23 crc 
kubenswrapper[4703]: I0309 13:41:23.526652 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-config-data-custom\") pod \"manila-api-0\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.526672 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-etc-machine-id\") pod \"manila-api-0\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.628128 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-config-data\") pod \"manila-api-0\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.628199 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-config-data-custom\") pod \"manila-api-0\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.628228 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-etc-machine-id\") pod \"manila-api-0\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.628299 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-logs\") pod \"manila-api-0\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.628333 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56bzp\" (UniqueName: \"kubernetes.io/projected/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-kube-api-access-56bzp\") pod \"manila-api-0\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.628402 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-scripts\") pod \"manila-api-0\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.628991 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-etc-machine-id\") pod \"manila-api-0\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.629517 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-logs\") pod \"manila-api-0\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.634384 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-config-data\") pod \"manila-api-0\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:41:23 crc 
kubenswrapper[4703]: I0309 13:41:23.634724 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-config-data-custom\") pod \"manila-api-0\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.635329 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-scripts\") pod \"manila-api-0\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.649698 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56bzp\" (UniqueName: \"kubernetes.io/projected/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-kube-api-access-56bzp\") pod \"manila-api-0\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.719204 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-api-0" Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.814554 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-share-share0-0"] Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.865677 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share0-0" event={"ID":"6e06cecc-5443-47b2-8936-85ac055ea57f","Type":"ContainerStarted","Data":"d4201a72e646c4a50d357536476f4658214412741d6e5f3b83b3a3321b175766"} Mar 09 13:41:23 crc kubenswrapper[4703]: I0309 13:41:23.926216 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-scheduler-0"] Mar 09 13:41:23 crc kubenswrapper[4703]: W0309 13:41:23.934294 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod978210ad_7b9d_4cfb_9248_5ee38ebbe489.slice/crio-9d23f88347bd3f6ebad00bbe4d2412135138f9b64040f8c5d163008f56eacc09 WatchSource:0}: Error finding container 9d23f88347bd3f6ebad00bbe4d2412135138f9b64040f8c5d163008f56eacc09: Status 404 returned error can't find the container with id 9d23f88347bd3f6ebad00bbe4d2412135138f9b64040f8c5d163008f56eacc09 Mar 09 13:41:24 crc kubenswrapper[4703]: I0309 13:41:24.176777 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-api-0"] Mar 09 13:41:24 crc kubenswrapper[4703]: W0309 13:41:24.186233 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod812c8f6e_a1ed_40b8_9aa9_7d6a637ecfac.slice/crio-8087fe00657dce92b454eaf335a0eeea37cf802143be5c43fc793e1a50601b10 WatchSource:0}: Error finding container 8087fe00657dce92b454eaf335a0eeea37cf802143be5c43fc793e1a50601b10: Status 404 returned error can't find the container with id 8087fe00657dce92b454eaf335a0eeea37cf802143be5c43fc793e1a50601b10 Mar 09 13:41:24 crc kubenswrapper[4703]: I0309 
13:41:24.875119 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-0" event={"ID":"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac","Type":"ContainerStarted","Data":"8b8af184830a6b6048951b0b80db23b040fa32b65bbb3e0110fed7ce7ccc7cdf"} Mar 09 13:41:24 crc kubenswrapper[4703]: I0309 13:41:24.875175 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-0" event={"ID":"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac","Type":"ContainerStarted","Data":"8087fe00657dce92b454eaf335a0eeea37cf802143be5c43fc793e1a50601b10"} Mar 09 13:41:24 crc kubenswrapper[4703]: I0309 13:41:24.876304 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-0" event={"ID":"978210ad-7b9d-4cfb-9248-5ee38ebbe489","Type":"ContainerStarted","Data":"9d23f88347bd3f6ebad00bbe4d2412135138f9b64040f8c5d163008f56eacc09"} Mar 09 13:41:25 crc kubenswrapper[4703]: I0309 13:41:25.886372 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-0" event={"ID":"978210ad-7b9d-4cfb-9248-5ee38ebbe489","Type":"ContainerStarted","Data":"855c438b17fd6f48c89ac3f87355086e3078feeac1fe499557b9a2a22baa41e8"} Mar 09 13:41:25 crc kubenswrapper[4703]: I0309 13:41:25.886956 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-0" event={"ID":"978210ad-7b9d-4cfb-9248-5ee38ebbe489","Type":"ContainerStarted","Data":"694b9d7025a089844429e96fdf5d59644336935d272e121fb0e3e4588c4fb9b1"} Mar 09 13:41:25 crc kubenswrapper[4703]: I0309 13:41:25.889695 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-0" event={"ID":"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac","Type":"ContainerStarted","Data":"1603c6b8c50178a5d0277808e76141a21e957f4317d8a8f443937691d07464b9"} Mar 09 13:41:25 crc kubenswrapper[4703]: I0309 13:41:25.889872 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="manila-kuttl-tests/manila-api-0" Mar 09 
13:41:25 crc kubenswrapper[4703]: I0309 13:41:25.937320 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-api-0" podStartSLOduration=2.937298974 podStartE2EDuration="2.937298974s" podCreationTimestamp="2026-03-09 13:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:25.935303468 +0000 UTC m=+1281.902719154" watchObservedRunningTime="2026-03-09 13:41:25.937298974 +0000 UTC m=+1281.904714650" Mar 09 13:41:25 crc kubenswrapper[4703]: I0309 13:41:25.939012 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-scheduler-0" podStartSLOduration=2.093724563 podStartE2EDuration="2.939004752s" podCreationTimestamp="2026-03-09 13:41:23 +0000 UTC" firstStartedPulling="2026-03-09 13:41:23.943659061 +0000 UTC m=+1279.911074747" lastFinishedPulling="2026-03-09 13:41:24.78893925 +0000 UTC m=+1280.756354936" observedRunningTime="2026-03-09 13:41:25.905687223 +0000 UTC m=+1281.873102909" watchObservedRunningTime="2026-03-09 13:41:25.939004752 +0000 UTC m=+1281.906420438" Mar 09 13:41:29 crc kubenswrapper[4703]: I0309 13:41:29.927703 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share0-0" event={"ID":"6e06cecc-5443-47b2-8936-85ac055ea57f","Type":"ContainerStarted","Data":"ace02df985a696b1c29ad5486e34527753a7e2c297ca6d4a7cca5aea49fce241"} Mar 09 13:41:29 crc kubenswrapper[4703]: I0309 13:41:29.928323 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share0-0" event={"ID":"6e06cecc-5443-47b2-8936-85ac055ea57f","Type":"ContainerStarted","Data":"7e51e3658ca73b2f35252189c57552f0a414c7f222d8036853717e2364a98884"} Mar 09 13:41:29 crc kubenswrapper[4703]: I0309 13:41:29.959463 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-share-share0-0" 
podStartSLOduration=2.064237192 podStartE2EDuration="6.959436842s" podCreationTimestamp="2026-03-09 13:41:23 +0000 UTC" firstStartedPulling="2026-03-09 13:41:23.831232404 +0000 UTC m=+1279.798648090" lastFinishedPulling="2026-03-09 13:41:28.726432044 +0000 UTC m=+1284.693847740" observedRunningTime="2026-03-09 13:41:29.95012661 +0000 UTC m=+1285.917542306" watchObservedRunningTime="2026-03-09 13:41:29.959436842 +0000 UTC m=+1285.926852568" Mar 09 13:41:33 crc kubenswrapper[4703]: I0309 13:41:33.475950 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:41:33 crc kubenswrapper[4703]: I0309 13:41:33.485576 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:44 crc kubenswrapper[4703]: I0309 13:41:44.918644 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:41:44 crc kubenswrapper[4703]: I0309 13:41:44.926221 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:41:44 crc kubenswrapper[4703]: I0309 13:41:44.975691 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="manila-kuttl-tests/manila-api-0" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.110373 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-api-1"] Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.112263 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-api-1" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.128779 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-api-2"] Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.130098 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-api-2" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.147834 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-api-2"] Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.155698 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-api-1"] Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.181767 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9567766d-4f6a-4f80-ad66-1d9b031b76b2-logs\") pod \"manila-api-1\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " pod="manila-kuttl-tests/manila-api-1" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.181824 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9567766d-4f6a-4f80-ad66-1d9b031b76b2-config-data\") pod \"manila-api-1\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " pod="manila-kuttl-tests/manila-api-1" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.181870 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f94ce884-2989-4345-b4c2-2d06e2fe5de6-config-data-custom\") pod \"manila-api-2\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " pod="manila-kuttl-tests/manila-api-2" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.181908 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9567766d-4f6a-4f80-ad66-1d9b031b76b2-etc-machine-id\") pod \"manila-api-1\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " pod="manila-kuttl-tests/manila-api-1" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.181932 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvx98\" (UniqueName: \"kubernetes.io/projected/f94ce884-2989-4345-b4c2-2d06e2fe5de6-kube-api-access-jvx98\") pod \"manila-api-2\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " pod="manila-kuttl-tests/manila-api-2" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.181962 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9567766d-4f6a-4f80-ad66-1d9b031b76b2-scripts\") pod \"manila-api-1\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " pod="manila-kuttl-tests/manila-api-1" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.182006 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f94ce884-2989-4345-b4c2-2d06e2fe5de6-etc-machine-id\") pod \"manila-api-2\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " pod="manila-kuttl-tests/manila-api-2" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.182031 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f94ce884-2989-4345-b4c2-2d06e2fe5de6-scripts\") pod \"manila-api-2\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " pod="manila-kuttl-tests/manila-api-2" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.182060 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76vr5\" (UniqueName: \"kubernetes.io/projected/9567766d-4f6a-4f80-ad66-1d9b031b76b2-kube-api-access-76vr5\") pod \"manila-api-1\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " pod="manila-kuttl-tests/manila-api-1" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.182085 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/f94ce884-2989-4345-b4c2-2d06e2fe5de6-logs\") pod \"manila-api-2\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " pod="manila-kuttl-tests/manila-api-2" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.182124 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f94ce884-2989-4345-b4c2-2d06e2fe5de6-config-data\") pod \"manila-api-2\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " pod="manila-kuttl-tests/manila-api-2" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.182162 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9567766d-4f6a-4f80-ad66-1d9b031b76b2-config-data-custom\") pod \"manila-api-1\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " pod="manila-kuttl-tests/manila-api-1" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.283042 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f94ce884-2989-4345-b4c2-2d06e2fe5de6-config-data-custom\") pod \"manila-api-2\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " pod="manila-kuttl-tests/manila-api-2" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.283103 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9567766d-4f6a-4f80-ad66-1d9b031b76b2-etc-machine-id\") pod \"manila-api-1\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " pod="manila-kuttl-tests/manila-api-1" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.283121 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvx98\" (UniqueName: \"kubernetes.io/projected/f94ce884-2989-4345-b4c2-2d06e2fe5de6-kube-api-access-jvx98\") pod \"manila-api-2\" (UID: 
\"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " pod="manila-kuttl-tests/manila-api-2" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.283183 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9567766d-4f6a-4f80-ad66-1d9b031b76b2-etc-machine-id\") pod \"manila-api-1\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " pod="manila-kuttl-tests/manila-api-1" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.283450 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9567766d-4f6a-4f80-ad66-1d9b031b76b2-scripts\") pod \"manila-api-1\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " pod="manila-kuttl-tests/manila-api-1" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.283510 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f94ce884-2989-4345-b4c2-2d06e2fe5de6-etc-machine-id\") pod \"manila-api-2\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " pod="manila-kuttl-tests/manila-api-2" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.283530 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f94ce884-2989-4345-b4c2-2d06e2fe5de6-scripts\") pod \"manila-api-2\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " pod="manila-kuttl-tests/manila-api-2" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.283580 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f94ce884-2989-4345-b4c2-2d06e2fe5de6-etc-machine-id\") pod \"manila-api-2\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " pod="manila-kuttl-tests/manila-api-2" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.283605 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-76vr5\" (UniqueName: \"kubernetes.io/projected/9567766d-4f6a-4f80-ad66-1d9b031b76b2-kube-api-access-76vr5\") pod \"manila-api-1\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " pod="manila-kuttl-tests/manila-api-1" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.283624 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f94ce884-2989-4345-b4c2-2d06e2fe5de6-logs\") pod \"manila-api-2\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " pod="manila-kuttl-tests/manila-api-2" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.283800 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f94ce884-2989-4345-b4c2-2d06e2fe5de6-config-data\") pod \"manila-api-2\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " pod="manila-kuttl-tests/manila-api-2" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.283988 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f94ce884-2989-4345-b4c2-2d06e2fe5de6-logs\") pod \"manila-api-2\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " pod="manila-kuttl-tests/manila-api-2" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.284126 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9567766d-4f6a-4f80-ad66-1d9b031b76b2-config-data-custom\") pod \"manila-api-1\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " pod="manila-kuttl-tests/manila-api-1" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.284154 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9567766d-4f6a-4f80-ad66-1d9b031b76b2-logs\") pod \"manila-api-1\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " 
pod="manila-kuttl-tests/manila-api-1" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.284175 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9567766d-4f6a-4f80-ad66-1d9b031b76b2-config-data\") pod \"manila-api-1\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " pod="manila-kuttl-tests/manila-api-1" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.284886 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9567766d-4f6a-4f80-ad66-1d9b031b76b2-logs\") pod \"manila-api-1\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " pod="manila-kuttl-tests/manila-api-1" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.288289 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f94ce884-2989-4345-b4c2-2d06e2fe5de6-config-data-custom\") pod \"manila-api-2\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " pod="manila-kuttl-tests/manila-api-2" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.289503 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9567766d-4f6a-4f80-ad66-1d9b031b76b2-config-data\") pod \"manila-api-1\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " pod="manila-kuttl-tests/manila-api-1" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.291323 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f94ce884-2989-4345-b4c2-2d06e2fe5de6-config-data\") pod \"manila-api-2\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " pod="manila-kuttl-tests/manila-api-2" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.292309 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f94ce884-2989-4345-b4c2-2d06e2fe5de6-scripts\") pod \"manila-api-2\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " pod="manila-kuttl-tests/manila-api-2" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.292449 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9567766d-4f6a-4f80-ad66-1d9b031b76b2-scripts\") pod \"manila-api-1\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " pod="manila-kuttl-tests/manila-api-1" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.298802 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76vr5\" (UniqueName: \"kubernetes.io/projected/9567766d-4f6a-4f80-ad66-1d9b031b76b2-kube-api-access-76vr5\") pod \"manila-api-1\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " pod="manila-kuttl-tests/manila-api-1" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.302267 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9567766d-4f6a-4f80-ad66-1d9b031b76b2-config-data-custom\") pod \"manila-api-1\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " pod="manila-kuttl-tests/manila-api-1" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.303511 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvx98\" (UniqueName: \"kubernetes.io/projected/f94ce884-2989-4345-b4c2-2d06e2fe5de6-kube-api-access-jvx98\") pod \"manila-api-2\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " pod="manila-kuttl-tests/manila-api-2" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.433834 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-api-1" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.447063 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-api-2" Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.939802 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-api-1"] Mar 09 13:41:47 crc kubenswrapper[4703]: W0309 13:41:47.942426 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9567766d_4f6a_4f80_ad66_1d9b031b76b2.slice/crio-2ebfb86c89a0b21a8c0133ab96f804df5ef12955109e2a2f6e1a3a4294cb560e WatchSource:0}: Error finding container 2ebfb86c89a0b21a8c0133ab96f804df5ef12955109e2a2f6e1a3a4294cb560e: Status 404 returned error can't find the container with id 2ebfb86c89a0b21a8c0133ab96f804df5ef12955109e2a2f6e1a3a4294cb560e Mar 09 13:41:47 crc kubenswrapper[4703]: I0309 13:41:47.984770 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-api-2"] Mar 09 13:41:47 crc kubenswrapper[4703]: W0309 13:41:47.990820 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf94ce884_2989_4345_b4c2_2d06e2fe5de6.slice/crio-2d8a7decd88cc10ea5aba52f701c914e64c4e370e7591014d353736eccd6fd18 WatchSource:0}: Error finding container 2d8a7decd88cc10ea5aba52f701c914e64c4e370e7591014d353736eccd6fd18: Status 404 returned error can't find the container with id 2d8a7decd88cc10ea5aba52f701c914e64c4e370e7591014d353736eccd6fd18 Mar 09 13:41:48 crc kubenswrapper[4703]: I0309 13:41:48.101506 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-1" event={"ID":"9567766d-4f6a-4f80-ad66-1d9b031b76b2","Type":"ContainerStarted","Data":"2ebfb86c89a0b21a8c0133ab96f804df5ef12955109e2a2f6e1a3a4294cb560e"} Mar 09 13:41:48 crc kubenswrapper[4703]: I0309 13:41:48.103304 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-2" 
event={"ID":"f94ce884-2989-4345-b4c2-2d06e2fe5de6","Type":"ContainerStarted","Data":"2d8a7decd88cc10ea5aba52f701c914e64c4e370e7591014d353736eccd6fd18"} Mar 09 13:41:49 crc kubenswrapper[4703]: I0309 13:41:49.113722 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-2" event={"ID":"f94ce884-2989-4345-b4c2-2d06e2fe5de6","Type":"ContainerStarted","Data":"36250babd5916751aa0c70d11d77fd7c6b0029a5716156c04e6aa70d5677e982"} Mar 09 13:41:49 crc kubenswrapper[4703]: I0309 13:41:49.114375 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="manila-kuttl-tests/manila-api-2" Mar 09 13:41:49 crc kubenswrapper[4703]: I0309 13:41:49.114405 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-2" event={"ID":"f94ce884-2989-4345-b4c2-2d06e2fe5de6","Type":"ContainerStarted","Data":"8ccae82db0e49a223f4a30e15f0d3c81167cc9573d4bdcb027ffd9a438064637"} Mar 09 13:41:49 crc kubenswrapper[4703]: I0309 13:41:49.122176 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-1" event={"ID":"9567766d-4f6a-4f80-ad66-1d9b031b76b2","Type":"ContainerStarted","Data":"894ef11ad23dc8593dd6ab64e93afc8b81ad52d4c7415c3647307ff7cff116e9"} Mar 09 13:41:49 crc kubenswrapper[4703]: I0309 13:41:49.122254 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-1" event={"ID":"9567766d-4f6a-4f80-ad66-1d9b031b76b2","Type":"ContainerStarted","Data":"8ea055d1a1dbc727dcc6b2b8c1eea407a50415e69faa21e5cd3f01052f7e3a78"} Mar 09 13:41:49 crc kubenswrapper[4703]: I0309 13:41:49.123208 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="manila-kuttl-tests/manila-api-1" Mar 09 13:41:49 crc kubenswrapper[4703]: I0309 13:41:49.147956 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-api-2" podStartSLOduration=2.147939108 podStartE2EDuration="2.147939108s" podCreationTimestamp="2026-03-09 
13:41:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:49.140526492 +0000 UTC m=+1305.107942178" watchObservedRunningTime="2026-03-09 13:41:49.147939108 +0000 UTC m=+1305.115354794" Mar 09 13:41:49 crc kubenswrapper[4703]: I0309 13:41:49.176168 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-api-1" podStartSLOduration=2.176130241 podStartE2EDuration="2.176130241s" podCreationTimestamp="2026-03-09 13:41:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:41:49.167140591 +0000 UTC m=+1305.134556277" watchObservedRunningTime="2026-03-09 13:41:49.176130241 +0000 UTC m=+1305.143545967" Mar 09 13:42:00 crc kubenswrapper[4703]: I0309 13:42:00.159318 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551062-lw8cm"] Mar 09 13:42:00 crc kubenswrapper[4703]: I0309 13:42:00.161533 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551062-lw8cm" Mar 09 13:42:00 crc kubenswrapper[4703]: I0309 13:42:00.164924 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:42:00 crc kubenswrapper[4703]: I0309 13:42:00.165286 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:42:00 crc kubenswrapper[4703]: I0309 13:42:00.165489 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rkrjn" Mar 09 13:42:00 crc kubenswrapper[4703]: I0309 13:42:00.183313 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551062-lw8cm"] Mar 09 13:42:00 crc kubenswrapper[4703]: I0309 13:42:00.287078 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrn6c\" (UniqueName: \"kubernetes.io/projected/8c507239-c234-4c0f-b792-43e2c592106c-kube-api-access-wrn6c\") pod \"auto-csr-approver-29551062-lw8cm\" (UID: \"8c507239-c234-4c0f-b792-43e2c592106c\") " pod="openshift-infra/auto-csr-approver-29551062-lw8cm" Mar 09 13:42:00 crc kubenswrapper[4703]: I0309 13:42:00.389513 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrn6c\" (UniqueName: \"kubernetes.io/projected/8c507239-c234-4c0f-b792-43e2c592106c-kube-api-access-wrn6c\") pod \"auto-csr-approver-29551062-lw8cm\" (UID: \"8c507239-c234-4c0f-b792-43e2c592106c\") " pod="openshift-infra/auto-csr-approver-29551062-lw8cm" Mar 09 13:42:00 crc kubenswrapper[4703]: I0309 13:42:00.428589 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrn6c\" (UniqueName: \"kubernetes.io/projected/8c507239-c234-4c0f-b792-43e2c592106c-kube-api-access-wrn6c\") pod \"auto-csr-approver-29551062-lw8cm\" (UID: \"8c507239-c234-4c0f-b792-43e2c592106c\") " 
pod="openshift-infra/auto-csr-approver-29551062-lw8cm" Mar 09 13:42:00 crc kubenswrapper[4703]: I0309 13:42:00.483190 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551062-lw8cm" Mar 09 13:42:00 crc kubenswrapper[4703]: I0309 13:42:00.729195 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551062-lw8cm"] Mar 09 13:42:00 crc kubenswrapper[4703]: W0309 13:42:00.736806 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c507239_c234_4c0f_b792_43e2c592106c.slice/crio-8f7a3a4c12882efcfc9a82b3b48ffb3ac5694d7310d751b6a5b0a41834bf12ee WatchSource:0}: Error finding container 8f7a3a4c12882efcfc9a82b3b48ffb3ac5694d7310d751b6a5b0a41834bf12ee: Status 404 returned error can't find the container with id 8f7a3a4c12882efcfc9a82b3b48ffb3ac5694d7310d751b6a5b0a41834bf12ee Mar 09 13:42:01 crc kubenswrapper[4703]: I0309 13:42:01.236345 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551062-lw8cm" event={"ID":"8c507239-c234-4c0f-b792-43e2c592106c","Type":"ContainerStarted","Data":"8f7a3a4c12882efcfc9a82b3b48ffb3ac5694d7310d751b6a5b0a41834bf12ee"} Mar 09 13:42:02 crc kubenswrapper[4703]: I0309 13:42:02.245464 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551062-lw8cm" event={"ID":"8c507239-c234-4c0f-b792-43e2c592106c","Type":"ContainerStarted","Data":"fc153240335b020ba4fe47c249a70fd899fc69a21c7c6187417de0fea757e6b8"} Mar 09 13:42:03 crc kubenswrapper[4703]: I0309 13:42:03.257009 4703 generic.go:334] "Generic (PLEG): container finished" podID="8c507239-c234-4c0f-b792-43e2c592106c" containerID="fc153240335b020ba4fe47c249a70fd899fc69a21c7c6187417de0fea757e6b8" exitCode=0 Mar 09 13:42:03 crc kubenswrapper[4703]: I0309 13:42:03.257079 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29551062-lw8cm" event={"ID":"8c507239-c234-4c0f-b792-43e2c592106c","Type":"ContainerDied","Data":"fc153240335b020ba4fe47c249a70fd899fc69a21c7c6187417de0fea757e6b8"} Mar 09 13:42:04 crc kubenswrapper[4703]: I0309 13:42:04.618713 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551062-lw8cm" Mar 09 13:42:04 crc kubenswrapper[4703]: I0309 13:42:04.754484 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrn6c\" (UniqueName: \"kubernetes.io/projected/8c507239-c234-4c0f-b792-43e2c592106c-kube-api-access-wrn6c\") pod \"8c507239-c234-4c0f-b792-43e2c592106c\" (UID: \"8c507239-c234-4c0f-b792-43e2c592106c\") " Mar 09 13:42:04 crc kubenswrapper[4703]: I0309 13:42:04.779429 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c507239-c234-4c0f-b792-43e2c592106c-kube-api-access-wrn6c" (OuterVolumeSpecName: "kube-api-access-wrn6c") pod "8c507239-c234-4c0f-b792-43e2c592106c" (UID: "8c507239-c234-4c0f-b792-43e2c592106c"). InnerVolumeSpecName "kube-api-access-wrn6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:04 crc kubenswrapper[4703]: I0309 13:42:04.856548 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrn6c\" (UniqueName: \"kubernetes.io/projected/8c507239-c234-4c0f-b792-43e2c592106c-kube-api-access-wrn6c\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:05 crc kubenswrapper[4703]: I0309 13:42:05.279640 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551062-lw8cm" event={"ID":"8c507239-c234-4c0f-b792-43e2c592106c","Type":"ContainerDied","Data":"8f7a3a4c12882efcfc9a82b3b48ffb3ac5694d7310d751b6a5b0a41834bf12ee"} Mar 09 13:42:05 crc kubenswrapper[4703]: I0309 13:42:05.280091 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f7a3a4c12882efcfc9a82b3b48ffb3ac5694d7310d751b6a5b0a41834bf12ee" Mar 09 13:42:05 crc kubenswrapper[4703]: I0309 13:42:05.279704 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551062-lw8cm" Mar 09 13:42:05 crc kubenswrapper[4703]: I0309 13:42:05.704807 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551056-wc7wk"] Mar 09 13:42:05 crc kubenswrapper[4703]: I0309 13:42:05.712702 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551056-wc7wk"] Mar 09 13:42:06 crc kubenswrapper[4703]: I0309 13:42:06.729905 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e" path="/var/lib/kubelet/pods/c8eb7ded-2d2b-4610-bd30-2b9fd1a6d57e/volumes" Mar 09 13:42:08 crc kubenswrapper[4703]: I0309 13:42:08.687512 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="manila-kuttl-tests/manila-api-2" Mar 09 13:42:08 crc kubenswrapper[4703]: I0309 13:42:08.785226 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="manila-kuttl-tests/manila-api-1" Mar 09 13:42:09 crc kubenswrapper[4703]: I0309 13:42:09.356558 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-api-2"] Mar 09 13:42:09 crc kubenswrapper[4703]: I0309 13:42:09.356767 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-api-2" podUID="f94ce884-2989-4345-b4c2-2d06e2fe5de6" containerName="manila-api-log" containerID="cri-o://8ccae82db0e49a223f4a30e15f0d3c81167cc9573d4bdcb027ffd9a438064637" gracePeriod=30 Mar 09 13:42:09 crc kubenswrapper[4703]: I0309 13:42:09.357146 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-api-2" podUID="f94ce884-2989-4345-b4c2-2d06e2fe5de6" containerName="manila-api" containerID="cri-o://36250babd5916751aa0c70d11d77fd7c6b0029a5716156c04e6aa70d5677e982" gracePeriod=30 Mar 09 13:42:09 crc kubenswrapper[4703]: I0309 13:42:09.377951 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-api-1"] Mar 09 13:42:09 crc kubenswrapper[4703]: I0309 13:42:09.378633 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-api-1" podUID="9567766d-4f6a-4f80-ad66-1d9b031b76b2" containerName="manila-api" containerID="cri-o://894ef11ad23dc8593dd6ab64e93afc8b81ad52d4c7415c3647307ff7cff116e9" gracePeriod=30 Mar 09 13:42:09 crc kubenswrapper[4703]: I0309 13:42:09.379449 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-api-1" podUID="9567766d-4f6a-4f80-ad66-1d9b031b76b2" containerName="manila-api-log" containerID="cri-o://8ea055d1a1dbc727dcc6b2b8c1eea407a50415e69faa21e5cd3f01052f7e3a78" gracePeriod=30 Mar 09 13:42:09 crc kubenswrapper[4703]: I0309 13:42:09.500471 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:42:09 crc kubenswrapper[4703]: I0309 13:42:09.500781 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:42:10 crc kubenswrapper[4703]: I0309 13:42:10.329045 4703 generic.go:334] "Generic (PLEG): container finished" podID="f94ce884-2989-4345-b4c2-2d06e2fe5de6" containerID="8ccae82db0e49a223f4a30e15f0d3c81167cc9573d4bdcb027ffd9a438064637" exitCode=143 Mar 09 13:42:10 crc kubenswrapper[4703]: I0309 13:42:10.329139 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-2" event={"ID":"f94ce884-2989-4345-b4c2-2d06e2fe5de6","Type":"ContainerDied","Data":"8ccae82db0e49a223f4a30e15f0d3c81167cc9573d4bdcb027ffd9a438064637"} Mar 09 13:42:10 crc kubenswrapper[4703]: I0309 13:42:10.332438 4703 generic.go:334] "Generic (PLEG): container finished" podID="9567766d-4f6a-4f80-ad66-1d9b031b76b2" containerID="8ea055d1a1dbc727dcc6b2b8c1eea407a50415e69faa21e5cd3f01052f7e3a78" exitCode=143 Mar 09 13:42:10 crc kubenswrapper[4703]: I0309 13:42:10.332472 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-1" event={"ID":"9567766d-4f6a-4f80-ad66-1d9b031b76b2","Type":"ContainerDied","Data":"8ea055d1a1dbc727dcc6b2b8c1eea407a50415e69faa21e5cd3f01052f7e3a78"} Mar 09 13:42:12 crc kubenswrapper[4703]: I0309 13:42:12.912023 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-api-2" Mar 09 13:42:12 crc kubenswrapper[4703]: I0309 13:42:12.918424 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-api-1" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.088006 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76vr5\" (UniqueName: \"kubernetes.io/projected/9567766d-4f6a-4f80-ad66-1d9b031b76b2-kube-api-access-76vr5\") pod \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.088048 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f94ce884-2989-4345-b4c2-2d06e2fe5de6-config-data-custom\") pod \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.088085 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f94ce884-2989-4345-b4c2-2d06e2fe5de6-scripts\") pod \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.088128 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9567766d-4f6a-4f80-ad66-1d9b031b76b2-etc-machine-id\") pod \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.088148 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f94ce884-2989-4345-b4c2-2d06e2fe5de6-config-data\") pod \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.088191 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/9567766d-4f6a-4f80-ad66-1d9b031b76b2-config-data\") pod \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.088211 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvx98\" (UniqueName: \"kubernetes.io/projected/f94ce884-2989-4345-b4c2-2d06e2fe5de6-kube-api-access-jvx98\") pod \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.088233 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9567766d-4f6a-4f80-ad66-1d9b031b76b2-logs\") pod \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.088258 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f94ce884-2989-4345-b4c2-2d06e2fe5de6-etc-machine-id\") pod \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.088300 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9567766d-4f6a-4f80-ad66-1d9b031b76b2-scripts\") pod \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.088321 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9567766d-4f6a-4f80-ad66-1d9b031b76b2-config-data-custom\") pod \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\" (UID: \"9567766d-4f6a-4f80-ad66-1d9b031b76b2\") " Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.088359 
4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f94ce884-2989-4345-b4c2-2d06e2fe5de6-logs\") pod \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\" (UID: \"f94ce884-2989-4345-b4c2-2d06e2fe5de6\") " Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.088679 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9567766d-4f6a-4f80-ad66-1d9b031b76b2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9567766d-4f6a-4f80-ad66-1d9b031b76b2" (UID: "9567766d-4f6a-4f80-ad66-1d9b031b76b2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.088724 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f94ce884-2989-4345-b4c2-2d06e2fe5de6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f94ce884-2989-4345-b4c2-2d06e2fe5de6" (UID: "f94ce884-2989-4345-b4c2-2d06e2fe5de6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.089047 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f94ce884-2989-4345-b4c2-2d06e2fe5de6-logs" (OuterVolumeSpecName: "logs") pod "f94ce884-2989-4345-b4c2-2d06e2fe5de6" (UID: "f94ce884-2989-4345-b4c2-2d06e2fe5de6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.089058 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9567766d-4f6a-4f80-ad66-1d9b031b76b2-logs" (OuterVolumeSpecName: "logs") pod "9567766d-4f6a-4f80-ad66-1d9b031b76b2" (UID: "9567766d-4f6a-4f80-ad66-1d9b031b76b2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.093927 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9567766d-4f6a-4f80-ad66-1d9b031b76b2-kube-api-access-76vr5" (OuterVolumeSpecName: "kube-api-access-76vr5") pod "9567766d-4f6a-4f80-ad66-1d9b031b76b2" (UID: "9567766d-4f6a-4f80-ad66-1d9b031b76b2"). InnerVolumeSpecName "kube-api-access-76vr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.094403 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9567766d-4f6a-4f80-ad66-1d9b031b76b2-scripts" (OuterVolumeSpecName: "scripts") pod "9567766d-4f6a-4f80-ad66-1d9b031b76b2" (UID: "9567766d-4f6a-4f80-ad66-1d9b031b76b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.094512 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f94ce884-2989-4345-b4c2-2d06e2fe5de6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f94ce884-2989-4345-b4c2-2d06e2fe5de6" (UID: "f94ce884-2989-4345-b4c2-2d06e2fe5de6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.094656 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f94ce884-2989-4345-b4c2-2d06e2fe5de6-scripts" (OuterVolumeSpecName: "scripts") pod "f94ce884-2989-4345-b4c2-2d06e2fe5de6" (UID: "f94ce884-2989-4345-b4c2-2d06e2fe5de6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.095169 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f94ce884-2989-4345-b4c2-2d06e2fe5de6-kube-api-access-jvx98" (OuterVolumeSpecName: "kube-api-access-jvx98") pod "f94ce884-2989-4345-b4c2-2d06e2fe5de6" (UID: "f94ce884-2989-4345-b4c2-2d06e2fe5de6"). InnerVolumeSpecName "kube-api-access-jvx98". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.096037 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9567766d-4f6a-4f80-ad66-1d9b031b76b2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9567766d-4f6a-4f80-ad66-1d9b031b76b2" (UID: "9567766d-4f6a-4f80-ad66-1d9b031b76b2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.126226 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f94ce884-2989-4345-b4c2-2d06e2fe5de6-config-data" (OuterVolumeSpecName: "config-data") pod "f94ce884-2989-4345-b4c2-2d06e2fe5de6" (UID: "f94ce884-2989-4345-b4c2-2d06e2fe5de6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.132176 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9567766d-4f6a-4f80-ad66-1d9b031b76b2-config-data" (OuterVolumeSpecName: "config-data") pod "9567766d-4f6a-4f80-ad66-1d9b031b76b2" (UID: "9567766d-4f6a-4f80-ad66-1d9b031b76b2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.190433 4703 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9567766d-4f6a-4f80-ad66-1d9b031b76b2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.190472 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f94ce884-2989-4345-b4c2-2d06e2fe5de6-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.190482 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9567766d-4f6a-4f80-ad66-1d9b031b76b2-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.190493 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvx98\" (UniqueName: \"kubernetes.io/projected/f94ce884-2989-4345-b4c2-2d06e2fe5de6-kube-api-access-jvx98\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.190503 4703 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9567766d-4f6a-4f80-ad66-1d9b031b76b2-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.190512 4703 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f94ce884-2989-4345-b4c2-2d06e2fe5de6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.190521 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9567766d-4f6a-4f80-ad66-1d9b031b76b2-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.190530 4703 reconciler_common.go:293] "Volume detached 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9567766d-4f6a-4f80-ad66-1d9b031b76b2-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.190537 4703 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f94ce884-2989-4345-b4c2-2d06e2fe5de6-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.190546 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76vr5\" (UniqueName: \"kubernetes.io/projected/9567766d-4f6a-4f80-ad66-1d9b031b76b2-kube-api-access-76vr5\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.190556 4703 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f94ce884-2989-4345-b4c2-2d06e2fe5de6-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.190565 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f94ce884-2989-4345-b4c2-2d06e2fe5de6-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.364285 4703 generic.go:334] "Generic (PLEG): container finished" podID="f94ce884-2989-4345-b4c2-2d06e2fe5de6" containerID="36250babd5916751aa0c70d11d77fd7c6b0029a5716156c04e6aa70d5677e982" exitCode=0 Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.364391 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-2" event={"ID":"f94ce884-2989-4345-b4c2-2d06e2fe5de6","Type":"ContainerDied","Data":"36250babd5916751aa0c70d11d77fd7c6b0029a5716156c04e6aa70d5677e982"} Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.364401 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-api-2" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.364432 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-2" event={"ID":"f94ce884-2989-4345-b4c2-2d06e2fe5de6","Type":"ContainerDied","Data":"2d8a7decd88cc10ea5aba52f701c914e64c4e370e7591014d353736eccd6fd18"} Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.364463 4703 scope.go:117] "RemoveContainer" containerID="36250babd5916751aa0c70d11d77fd7c6b0029a5716156c04e6aa70d5677e982" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.368827 4703 generic.go:334] "Generic (PLEG): container finished" podID="9567766d-4f6a-4f80-ad66-1d9b031b76b2" containerID="894ef11ad23dc8593dd6ab64e93afc8b81ad52d4c7415c3647307ff7cff116e9" exitCode=0 Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.368917 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-1" event={"ID":"9567766d-4f6a-4f80-ad66-1d9b031b76b2","Type":"ContainerDied","Data":"894ef11ad23dc8593dd6ab64e93afc8b81ad52d4c7415c3647307ff7cff116e9"} Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.368958 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-1" event={"ID":"9567766d-4f6a-4f80-ad66-1d9b031b76b2","Type":"ContainerDied","Data":"2ebfb86c89a0b21a8c0133ab96f804df5ef12955109e2a2f6e1a3a4294cb560e"} Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.369060 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-api-1" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.399973 4703 scope.go:117] "RemoveContainer" containerID="8ccae82db0e49a223f4a30e15f0d3c81167cc9573d4bdcb027ffd9a438064637" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.416958 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-api-1"] Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.424759 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-api-1"] Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.435882 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-api-2"] Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.440070 4703 scope.go:117] "RemoveContainer" containerID="36250babd5916751aa0c70d11d77fd7c6b0029a5716156c04e6aa70d5677e982" Mar 09 13:42:13 crc kubenswrapper[4703]: E0309 13:42:13.440618 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36250babd5916751aa0c70d11d77fd7c6b0029a5716156c04e6aa70d5677e982\": container with ID starting with 36250babd5916751aa0c70d11d77fd7c6b0029a5716156c04e6aa70d5677e982 not found: ID does not exist" containerID="36250babd5916751aa0c70d11d77fd7c6b0029a5716156c04e6aa70d5677e982" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.440705 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36250babd5916751aa0c70d11d77fd7c6b0029a5716156c04e6aa70d5677e982"} err="failed to get container status \"36250babd5916751aa0c70d11d77fd7c6b0029a5716156c04e6aa70d5677e982\": rpc error: code = NotFound desc = could not find container \"36250babd5916751aa0c70d11d77fd7c6b0029a5716156c04e6aa70d5677e982\": container with ID starting with 36250babd5916751aa0c70d11d77fd7c6b0029a5716156c04e6aa70d5677e982 not found: ID does not exist" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 
13:42:13.440746 4703 scope.go:117] "RemoveContainer" containerID="8ccae82db0e49a223f4a30e15f0d3c81167cc9573d4bdcb027ffd9a438064637" Mar 09 13:42:13 crc kubenswrapper[4703]: E0309 13:42:13.441335 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ccae82db0e49a223f4a30e15f0d3c81167cc9573d4bdcb027ffd9a438064637\": container with ID starting with 8ccae82db0e49a223f4a30e15f0d3c81167cc9573d4bdcb027ffd9a438064637 not found: ID does not exist" containerID="8ccae82db0e49a223f4a30e15f0d3c81167cc9573d4bdcb027ffd9a438064637" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.441429 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ccae82db0e49a223f4a30e15f0d3c81167cc9573d4bdcb027ffd9a438064637"} err="failed to get container status \"8ccae82db0e49a223f4a30e15f0d3c81167cc9573d4bdcb027ffd9a438064637\": rpc error: code = NotFound desc = could not find container \"8ccae82db0e49a223f4a30e15f0d3c81167cc9573d4bdcb027ffd9a438064637\": container with ID starting with 8ccae82db0e49a223f4a30e15f0d3c81167cc9573d4bdcb027ffd9a438064637 not found: ID does not exist" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.441483 4703 scope.go:117] "RemoveContainer" containerID="894ef11ad23dc8593dd6ab64e93afc8b81ad52d4c7415c3647307ff7cff116e9" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.447604 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-api-2"] Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.462874 4703 scope.go:117] "RemoveContainer" containerID="8ea055d1a1dbc727dcc6b2b8c1eea407a50415e69faa21e5cd3f01052f7e3a78" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.485153 4703 scope.go:117] "RemoveContainer" containerID="894ef11ad23dc8593dd6ab64e93afc8b81ad52d4c7415c3647307ff7cff116e9" Mar 09 13:42:13 crc kubenswrapper[4703]: E0309 13:42:13.485656 4703 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"894ef11ad23dc8593dd6ab64e93afc8b81ad52d4c7415c3647307ff7cff116e9\": container with ID starting with 894ef11ad23dc8593dd6ab64e93afc8b81ad52d4c7415c3647307ff7cff116e9 not found: ID does not exist" containerID="894ef11ad23dc8593dd6ab64e93afc8b81ad52d4c7415c3647307ff7cff116e9" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.485691 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"894ef11ad23dc8593dd6ab64e93afc8b81ad52d4c7415c3647307ff7cff116e9"} err="failed to get container status \"894ef11ad23dc8593dd6ab64e93afc8b81ad52d4c7415c3647307ff7cff116e9\": rpc error: code = NotFound desc = could not find container \"894ef11ad23dc8593dd6ab64e93afc8b81ad52d4c7415c3647307ff7cff116e9\": container with ID starting with 894ef11ad23dc8593dd6ab64e93afc8b81ad52d4c7415c3647307ff7cff116e9 not found: ID does not exist" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.485716 4703 scope.go:117] "RemoveContainer" containerID="8ea055d1a1dbc727dcc6b2b8c1eea407a50415e69faa21e5cd3f01052f7e3a78" Mar 09 13:42:13 crc kubenswrapper[4703]: E0309 13:42:13.486071 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ea055d1a1dbc727dcc6b2b8c1eea407a50415e69faa21e5cd3f01052f7e3a78\": container with ID starting with 8ea055d1a1dbc727dcc6b2b8c1eea407a50415e69faa21e5cd3f01052f7e3a78 not found: ID does not exist" containerID="8ea055d1a1dbc727dcc6b2b8c1eea407a50415e69faa21e5cd3f01052f7e3a78" Mar 09 13:42:13 crc kubenswrapper[4703]: I0309 13:42:13.486098 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ea055d1a1dbc727dcc6b2b8c1eea407a50415e69faa21e5cd3f01052f7e3a78"} err="failed to get container status \"8ea055d1a1dbc727dcc6b2b8c1eea407a50415e69faa21e5cd3f01052f7e3a78\": rpc error: code = NotFound desc = could not find container 
\"8ea055d1a1dbc727dcc6b2b8c1eea407a50415e69faa21e5cd3f01052f7e3a78\": container with ID starting with 8ea055d1a1dbc727dcc6b2b8c1eea407a50415e69faa21e5cd3f01052f7e3a78 not found: ID does not exist" Mar 09 13:42:14 crc kubenswrapper[4703]: I0309 13:42:14.723494 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9567766d-4f6a-4f80-ad66-1d9b031b76b2" path="/var/lib/kubelet/pods/9567766d-4f6a-4f80-ad66-1d9b031b76b2/volumes" Mar 09 13:42:14 crc kubenswrapper[4703]: I0309 13:42:14.724881 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f94ce884-2989-4345-b4c2-2d06e2fe5de6" path="/var/lib/kubelet/pods/f94ce884-2989-4345-b4c2-2d06e2fe5de6/volumes" Mar 09 13:42:14 crc kubenswrapper[4703]: I0309 13:42:14.844968 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-scheduler-1"] Mar 09 13:42:14 crc kubenswrapper[4703]: E0309 13:42:14.845401 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f94ce884-2989-4345-b4c2-2d06e2fe5de6" containerName="manila-api-log" Mar 09 13:42:14 crc kubenswrapper[4703]: I0309 13:42:14.845431 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94ce884-2989-4345-b4c2-2d06e2fe5de6" containerName="manila-api-log" Mar 09 13:42:14 crc kubenswrapper[4703]: E0309 13:42:14.845453 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9567766d-4f6a-4f80-ad66-1d9b031b76b2" containerName="manila-api-log" Mar 09 13:42:14 crc kubenswrapper[4703]: I0309 13:42:14.845465 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="9567766d-4f6a-4f80-ad66-1d9b031b76b2" containerName="manila-api-log" Mar 09 13:42:14 crc kubenswrapper[4703]: E0309 13:42:14.845503 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9567766d-4f6a-4f80-ad66-1d9b031b76b2" containerName="manila-api" Mar 09 13:42:14 crc kubenswrapper[4703]: I0309 13:42:14.845516 4703 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9567766d-4f6a-4f80-ad66-1d9b031b76b2" containerName="manila-api" Mar 09 13:42:14 crc kubenswrapper[4703]: E0309 13:42:14.845533 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f94ce884-2989-4345-b4c2-2d06e2fe5de6" containerName="manila-api" Mar 09 13:42:14 crc kubenswrapper[4703]: I0309 13:42:14.845562 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94ce884-2989-4345-b4c2-2d06e2fe5de6" containerName="manila-api" Mar 09 13:42:14 crc kubenswrapper[4703]: E0309 13:42:14.845577 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c507239-c234-4c0f-b792-43e2c592106c" containerName="oc" Mar 09 13:42:14 crc kubenswrapper[4703]: I0309 13:42:14.845588 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c507239-c234-4c0f-b792-43e2c592106c" containerName="oc" Mar 09 13:42:14 crc kubenswrapper[4703]: I0309 13:42:14.845798 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="f94ce884-2989-4345-b4c2-2d06e2fe5de6" containerName="manila-api-log" Mar 09 13:42:14 crc kubenswrapper[4703]: I0309 13:42:14.845819 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c507239-c234-4c0f-b792-43e2c592106c" containerName="oc" Mar 09 13:42:14 crc kubenswrapper[4703]: I0309 13:42:14.845877 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="9567766d-4f6a-4f80-ad66-1d9b031b76b2" containerName="manila-api" Mar 09 13:42:14 crc kubenswrapper[4703]: I0309 13:42:14.845905 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="9567766d-4f6a-4f80-ad66-1d9b031b76b2" containerName="manila-api-log" Mar 09 13:42:14 crc kubenswrapper[4703]: I0309 13:42:14.845929 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="f94ce884-2989-4345-b4c2-2d06e2fe5de6" containerName="manila-api" Mar 09 13:42:14 crc kubenswrapper[4703]: I0309 13:42:14.847235 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-scheduler-1" Mar 09 13:42:14 crc kubenswrapper[4703]: I0309 13:42:14.872629 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-scheduler-1"] Mar 09 13:42:15 crc kubenswrapper[4703]: I0309 13:42:15.023422 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-config-data-custom\") pod \"manila-scheduler-1\" (UID: \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\") " pod="manila-kuttl-tests/manila-scheduler-1" Mar 09 13:42:15 crc kubenswrapper[4703]: I0309 13:42:15.023559 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-config-data\") pod \"manila-scheduler-1\" (UID: \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\") " pod="manila-kuttl-tests/manila-scheduler-1" Mar 09 13:42:15 crc kubenswrapper[4703]: I0309 13:42:15.023649 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-etc-machine-id\") pod \"manila-scheduler-1\" (UID: \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\") " pod="manila-kuttl-tests/manila-scheduler-1" Mar 09 13:42:15 crc kubenswrapper[4703]: I0309 13:42:15.023703 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-scripts\") pod \"manila-scheduler-1\" (UID: \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\") " pod="manila-kuttl-tests/manila-scheduler-1" Mar 09 13:42:15 crc kubenswrapper[4703]: I0309 13:42:15.023747 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmvdk\" 
(UniqueName: \"kubernetes.io/projected/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-kube-api-access-mmvdk\") pod \"manila-scheduler-1\" (UID: \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\") " pod="manila-kuttl-tests/manila-scheduler-1" Mar 09 13:42:15 crc kubenswrapper[4703]: I0309 13:42:15.125148 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-scripts\") pod \"manila-scheduler-1\" (UID: \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\") " pod="manila-kuttl-tests/manila-scheduler-1" Mar 09 13:42:15 crc kubenswrapper[4703]: I0309 13:42:15.125203 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmvdk\" (UniqueName: \"kubernetes.io/projected/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-kube-api-access-mmvdk\") pod \"manila-scheduler-1\" (UID: \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\") " pod="manila-kuttl-tests/manila-scheduler-1" Mar 09 13:42:15 crc kubenswrapper[4703]: I0309 13:42:15.125419 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-config-data-custom\") pod \"manila-scheduler-1\" (UID: \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\") " pod="manila-kuttl-tests/manila-scheduler-1" Mar 09 13:42:15 crc kubenswrapper[4703]: I0309 13:42:15.125470 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-config-data\") pod \"manila-scheduler-1\" (UID: \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\") " pod="manila-kuttl-tests/manila-scheduler-1" Mar 09 13:42:15 crc kubenswrapper[4703]: I0309 13:42:15.125516 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-etc-machine-id\") pod 
\"manila-scheduler-1\" (UID: \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\") " pod="manila-kuttl-tests/manila-scheduler-1" Mar 09 13:42:15 crc kubenswrapper[4703]: I0309 13:42:15.125570 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-etc-machine-id\") pod \"manila-scheduler-1\" (UID: \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\") " pod="manila-kuttl-tests/manila-scheduler-1" Mar 09 13:42:15 crc kubenswrapper[4703]: I0309 13:42:15.131782 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-config-data\") pod \"manila-scheduler-1\" (UID: \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\") " pod="manila-kuttl-tests/manila-scheduler-1" Mar 09 13:42:15 crc kubenswrapper[4703]: I0309 13:42:15.132065 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-config-data-custom\") pod \"manila-scheduler-1\" (UID: \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\") " pod="manila-kuttl-tests/manila-scheduler-1" Mar 09 13:42:15 crc kubenswrapper[4703]: I0309 13:42:15.132495 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-scripts\") pod \"manila-scheduler-1\" (UID: \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\") " pod="manila-kuttl-tests/manila-scheduler-1" Mar 09 13:42:15 crc kubenswrapper[4703]: I0309 13:42:15.145235 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmvdk\" (UniqueName: \"kubernetes.io/projected/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-kube-api-access-mmvdk\") pod \"manila-scheduler-1\" (UID: \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\") " pod="manila-kuttl-tests/manila-scheduler-1" Mar 09 13:42:15 crc 
kubenswrapper[4703]: I0309 13:42:15.173537 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-scheduler-1" Mar 09 13:42:15 crc kubenswrapper[4703]: I0309 13:42:15.602216 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-scheduler-1"] Mar 09 13:42:16 crc kubenswrapper[4703]: I0309 13:42:16.395638 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-1" event={"ID":"0ef9316d-eaf8-48a3-8172-cd81f3ebab38","Type":"ContainerStarted","Data":"841c3e9a206d8fada5718c806e870bf91e26ad5a86319a175cd85971edec9077"} Mar 09 13:42:16 crc kubenswrapper[4703]: I0309 13:42:16.396162 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-1" event={"ID":"0ef9316d-eaf8-48a3-8172-cd81f3ebab38","Type":"ContainerStarted","Data":"191bed86377fb9cdedc5fca8adc9b01e876d213f7f3b3549489100d416c67395"} Mar 09 13:42:17 crc kubenswrapper[4703]: I0309 13:42:17.405991 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-1" event={"ID":"0ef9316d-eaf8-48a3-8172-cd81f3ebab38","Type":"ContainerStarted","Data":"323afa96929e1ca6685f5ccb5bc90b77dc1cec252dd817da61222b0790ee2f12"} Mar 09 13:42:17 crc kubenswrapper[4703]: I0309 13:42:17.432765 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-scheduler-1" podStartSLOduration=3.4327351 podStartE2EDuration="3.4327351s" podCreationTimestamp="2026-03-09 13:42:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:42:17.427552426 +0000 UTC m=+1333.394968142" watchObservedRunningTime="2026-03-09 13:42:17.4327351 +0000 UTC m=+1333.400150826" Mar 09 13:42:25 crc kubenswrapper[4703]: I0309 13:42:25.174657 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="manila-kuttl-tests/manila-scheduler-1" Mar 09 13:42:26 crc kubenswrapper[4703]: I0309 13:42:26.790088 4703 scope.go:117] "RemoveContainer" containerID="c74e409538490bf48b93f38832cde9297064709b7b4c29b68d8aaca4d57b6e80" Mar 09 13:42:36 crc kubenswrapper[4703]: I0309 13:42:36.802645 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="manila-kuttl-tests/manila-scheduler-1" Mar 09 13:42:36 crc kubenswrapper[4703]: I0309 13:42:36.875547 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-scheduler-2"] Mar 09 13:42:36 crc kubenswrapper[4703]: I0309 13:42:36.876825 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-scheduler-2" Mar 09 13:42:36 crc kubenswrapper[4703]: I0309 13:42:36.889546 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-scheduler-2"] Mar 09 13:42:36 crc kubenswrapper[4703]: I0309 13:42:36.999231 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-scripts\") pod \"manila-scheduler-2\" (UID: \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\") " pod="manila-kuttl-tests/manila-scheduler-2" Mar 09 13:42:36 crc kubenswrapper[4703]: I0309 13:42:36.999393 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85x9x\" (UniqueName: \"kubernetes.io/projected/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-kube-api-access-85x9x\") pod \"manila-scheduler-2\" (UID: \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\") " pod="manila-kuttl-tests/manila-scheduler-2" Mar 09 13:42:36 crc kubenswrapper[4703]: I0309 13:42:36.999438 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-etc-machine-id\") pod 
\"manila-scheduler-2\" (UID: \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\") " pod="manila-kuttl-tests/manila-scheduler-2" Mar 09 13:42:36 crc kubenswrapper[4703]: I0309 13:42:36.999476 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-config-data-custom\") pod \"manila-scheduler-2\" (UID: \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\") " pod="manila-kuttl-tests/manila-scheduler-2" Mar 09 13:42:36 crc kubenswrapper[4703]: I0309 13:42:36.999498 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-config-data\") pod \"manila-scheduler-2\" (UID: \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\") " pod="manila-kuttl-tests/manila-scheduler-2" Mar 09 13:42:37 crc kubenswrapper[4703]: I0309 13:42:37.101071 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-scripts\") pod \"manila-scheduler-2\" (UID: \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\") " pod="manila-kuttl-tests/manila-scheduler-2" Mar 09 13:42:37 crc kubenswrapper[4703]: I0309 13:42:37.101244 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85x9x\" (UniqueName: \"kubernetes.io/projected/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-kube-api-access-85x9x\") pod \"manila-scheduler-2\" (UID: \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\") " pod="manila-kuttl-tests/manila-scheduler-2" Mar 09 13:42:37 crc kubenswrapper[4703]: I0309 13:42:37.101347 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-etc-machine-id\") pod \"manila-scheduler-2\" (UID: \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\") " 
pod="manila-kuttl-tests/manila-scheduler-2" Mar 09 13:42:37 crc kubenswrapper[4703]: I0309 13:42:37.101483 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-etc-machine-id\") pod \"manila-scheduler-2\" (UID: \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\") " pod="manila-kuttl-tests/manila-scheduler-2" Mar 09 13:42:37 crc kubenswrapper[4703]: I0309 13:42:37.101605 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-config-data-custom\") pod \"manila-scheduler-2\" (UID: \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\") " pod="manila-kuttl-tests/manila-scheduler-2" Mar 09 13:42:37 crc kubenswrapper[4703]: I0309 13:42:37.102827 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-config-data\") pod \"manila-scheduler-2\" (UID: \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\") " pod="manila-kuttl-tests/manila-scheduler-2" Mar 09 13:42:37 crc kubenswrapper[4703]: I0309 13:42:37.107242 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-scripts\") pod \"manila-scheduler-2\" (UID: \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\") " pod="manila-kuttl-tests/manila-scheduler-2" Mar 09 13:42:37 crc kubenswrapper[4703]: I0309 13:42:37.107741 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-config-data\") pod \"manila-scheduler-2\" (UID: \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\") " pod="manila-kuttl-tests/manila-scheduler-2" Mar 09 13:42:37 crc kubenswrapper[4703]: I0309 13:42:37.107824 4703 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-config-data-custom\") pod \"manila-scheduler-2\" (UID: \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\") " pod="manila-kuttl-tests/manila-scheduler-2" Mar 09 13:42:37 crc kubenswrapper[4703]: I0309 13:42:37.116425 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85x9x\" (UniqueName: \"kubernetes.io/projected/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-kube-api-access-85x9x\") pod \"manila-scheduler-2\" (UID: \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\") " pod="manila-kuttl-tests/manila-scheduler-2" Mar 09 13:42:37 crc kubenswrapper[4703]: I0309 13:42:37.193911 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-scheduler-2" Mar 09 13:42:37 crc kubenswrapper[4703]: I0309 13:42:37.465352 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-scheduler-2"] Mar 09 13:42:37 crc kubenswrapper[4703]: I0309 13:42:37.622090 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-2" event={"ID":"a74a24f0-1a47-4ec0-9c06-4362768ae4ee","Type":"ContainerStarted","Data":"35de41e593cc6e80921206b6ca6848a48d309b95e5440d76b101d7e2cf68d42a"} Mar 09 13:42:38 crc kubenswrapper[4703]: I0309 13:42:38.634340 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-2" event={"ID":"a74a24f0-1a47-4ec0-9c06-4362768ae4ee","Type":"ContainerStarted","Data":"fa3b535b04783dd535a6a864655f68d34f1ae1b67c08ad10eb42d0b273b34e06"} Mar 09 13:42:38 crc kubenswrapper[4703]: I0309 13:42:38.634918 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-2" event={"ID":"a74a24f0-1a47-4ec0-9c06-4362768ae4ee","Type":"ContainerStarted","Data":"6429b4cd8482b9ab17a6e0b9bb6e828e4fd274d54e5629dc3d4a6988364d3ef9"} Mar 09 13:42:38 crc kubenswrapper[4703]: I0309 13:42:38.663872 4703 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-scheduler-2" podStartSLOduration=2.663832852 podStartE2EDuration="2.663832852s" podCreationTimestamp="2026-03-09 13:42:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:42:38.660797097 +0000 UTC m=+1354.628212793" watchObservedRunningTime="2026-03-09 13:42:38.663832852 +0000 UTC m=+1354.631248538" Mar 09 13:42:39 crc kubenswrapper[4703]: I0309 13:42:39.500135 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:42:39 crc kubenswrapper[4703]: I0309 13:42:39.500190 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:42:47 crc kubenswrapper[4703]: I0309 13:42:47.194893 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="manila-kuttl-tests/manila-scheduler-2" Mar 09 13:42:48 crc kubenswrapper[4703]: I0309 13:42:48.694895 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="manila-kuttl-tests/manila-scheduler-2" Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.321147 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-db-sync-bvnrg"] Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.327623 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-db-sync-bvnrg"] Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 
13:42:50.351013 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-scheduler-2"] Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.351228 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-scheduler-2" podUID="a74a24f0-1a47-4ec0-9c06-4362768ae4ee" containerName="manila-scheduler" containerID="cri-o://6429b4cd8482b9ab17a6e0b9bb6e828e4fd274d54e5629dc3d4a6988364d3ef9" gracePeriod=30 Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.351302 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-scheduler-2" podUID="a74a24f0-1a47-4ec0-9c06-4362768ae4ee" containerName="probe" containerID="cri-o://fa3b535b04783dd535a6a864655f68d34f1ae1b67c08ad10eb42d0b273b34e06" gracePeriod=30 Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.359272 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-scheduler-0"] Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.359746 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-scheduler-0" podUID="978210ad-7b9d-4cfb-9248-5ee38ebbe489" containerName="manila-scheduler" containerID="cri-o://694b9d7025a089844429e96fdf5d59644336935d272e121fb0e3e4588c4fb9b1" gracePeriod=30 Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.360087 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-scheduler-0" podUID="978210ad-7b9d-4cfb-9248-5ee38ebbe489" containerName="probe" containerID="cri-o://855c438b17fd6f48c89ac3f87355086e3078feeac1fe499557b9a2a22baa41e8" gracePeriod=30 Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.365328 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-scheduler-1"] Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.365594 4703 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="manila-kuttl-tests/manila-scheduler-1" podUID="0ef9316d-eaf8-48a3-8172-cd81f3ebab38" containerName="manila-scheduler" containerID="cri-o://841c3e9a206d8fada5718c806e870bf91e26ad5a86319a175cd85971edec9077" gracePeriod=30 Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.366020 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-scheduler-1" podUID="0ef9316d-eaf8-48a3-8172-cd81f3ebab38" containerName="probe" containerID="cri-o://323afa96929e1ca6685f5ccb5bc90b77dc1cec252dd817da61222b0790ee2f12" gracePeriod=30 Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.386815 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila829d-account-delete-4m64p"] Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.387599 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila829d-account-delete-4m64p" Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.398502 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila829d-account-delete-4m64p"] Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.408681 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-share-share0-0"] Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.423360 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-share-share0-0" podUID="6e06cecc-5443-47b2-8936-85ac055ea57f" containerName="manila-share" containerID="cri-o://7e51e3658ca73b2f35252189c57552f0a414c7f222d8036853717e2364a98884" gracePeriod=30 Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.417764 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-share-share0-0" podUID="6e06cecc-5443-47b2-8936-85ac055ea57f" containerName="probe" 
containerID="cri-o://ace02df985a696b1c29ad5486e34527753a7e2c297ca6d4a7cca5aea49fce241" gracePeriod=30 Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.426363 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd3fcb56-0a55-43e9-93d9-124fc2269d75-operator-scripts\") pod \"manila829d-account-delete-4m64p\" (UID: \"cd3fcb56-0a55-43e9-93d9-124fc2269d75\") " pod="manila-kuttl-tests/manila829d-account-delete-4m64p" Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.426445 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf79x\" (UniqueName: \"kubernetes.io/projected/cd3fcb56-0a55-43e9-93d9-124fc2269d75-kube-api-access-sf79x\") pod \"manila829d-account-delete-4m64p\" (UID: \"cd3fcb56-0a55-43e9-93d9-124fc2269d75\") " pod="manila-kuttl-tests/manila829d-account-delete-4m64p" Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.485401 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-api-0"] Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.485652 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-api-0" podUID="812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac" containerName="manila-api-log" containerID="cri-o://8b8af184830a6b6048951b0b80db23b040fa32b65bbb3e0110fed7ce7ccc7cdf" gracePeriod=30 Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.485790 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-api-0" podUID="812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac" containerName="manila-api" containerID="cri-o://1603c6b8c50178a5d0277808e76141a21e957f4317d8a8f443937691d07464b9" gracePeriod=30 Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.528837 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/cd3fcb56-0a55-43e9-93d9-124fc2269d75-operator-scripts\") pod \"manila829d-account-delete-4m64p\" (UID: \"cd3fcb56-0a55-43e9-93d9-124fc2269d75\") " pod="manila-kuttl-tests/manila829d-account-delete-4m64p" Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.529191 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf79x\" (UniqueName: \"kubernetes.io/projected/cd3fcb56-0a55-43e9-93d9-124fc2269d75-kube-api-access-sf79x\") pod \"manila829d-account-delete-4m64p\" (UID: \"cd3fcb56-0a55-43e9-93d9-124fc2269d75\") " pod="manila-kuttl-tests/manila829d-account-delete-4m64p" Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.530205 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd3fcb56-0a55-43e9-93d9-124fc2269d75-operator-scripts\") pod \"manila829d-account-delete-4m64p\" (UID: \"cd3fcb56-0a55-43e9-93d9-124fc2269d75\") " pod="manila-kuttl-tests/manila829d-account-delete-4m64p" Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.546957 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf79x\" (UniqueName: \"kubernetes.io/projected/cd3fcb56-0a55-43e9-93d9-124fc2269d75-kube-api-access-sf79x\") pod \"manila829d-account-delete-4m64p\" (UID: \"cd3fcb56-0a55-43e9-93d9-124fc2269d75\") " pod="manila-kuttl-tests/manila829d-account-delete-4m64p" Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.713968 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="042542ba-8fa8-4c28-b930-75e5b6c0b4ff" path="/var/lib/kubelet/pods/042542ba-8fa8-4c28-b930-75e5b6c0b4ff/volumes" Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.740064 4703 generic.go:334] "Generic (PLEG): container finished" podID="812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac" containerID="8b8af184830a6b6048951b0b80db23b040fa32b65bbb3e0110fed7ce7ccc7cdf" exitCode=143 Mar 09 13:42:50 crc 
kubenswrapper[4703]: I0309 13:42:50.740123 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-0" event={"ID":"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac","Type":"ContainerDied","Data":"8b8af184830a6b6048951b0b80db23b040fa32b65bbb3e0110fed7ce7ccc7cdf"} Mar 09 13:42:50 crc kubenswrapper[4703]: I0309 13:42:50.750257 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila829d-account-delete-4m64p" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.202452 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila829d-account-delete-4m64p"] Mar 09 13:42:51 crc kubenswrapper[4703]: W0309 13:42:51.211679 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd3fcb56_0a55_43e9_93d9_124fc2269d75.slice/crio-ea3d58e8470f1333d8f30a4c07c267ee724e28e065a4188c97dde0ae2c47d422 WatchSource:0}: Error finding container ea3d58e8470f1333d8f30a4c07c267ee724e28e065a4188c97dde0ae2c47d422: Status 404 returned error can't find the container with id ea3d58e8470f1333d8f30a4c07c267ee724e28e065a4188c97dde0ae2c47d422 Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.246874 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.445282 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-config-data\") pod \"6e06cecc-5443-47b2-8936-85ac055ea57f\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.445361 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-scripts\") pod \"6e06cecc-5443-47b2-8936-85ac055ea57f\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.445384 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbvwd\" (UniqueName: \"kubernetes.io/projected/6e06cecc-5443-47b2-8936-85ac055ea57f-kube-api-access-bbvwd\") pod \"6e06cecc-5443-47b2-8936-85ac055ea57f\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.445412 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e06cecc-5443-47b2-8936-85ac055ea57f-etc-machine-id\") pod \"6e06cecc-5443-47b2-8936-85ac055ea57f\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.445472 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-config-data-custom\") pod \"6e06cecc-5443-47b2-8936-85ac055ea57f\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.445495 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-ceph\") pod \"6e06cecc-5443-47b2-8936-85ac055ea57f\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.445509 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/6e06cecc-5443-47b2-8936-85ac055ea57f-var-lib-manila\") pod \"6e06cecc-5443-47b2-8936-85ac055ea57f\" (UID: \"6e06cecc-5443-47b2-8936-85ac055ea57f\") " Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.445639 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e06cecc-5443-47b2-8936-85ac055ea57f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6e06cecc-5443-47b2-8936-85ac055ea57f" (UID: "6e06cecc-5443-47b2-8936-85ac055ea57f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.445724 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e06cecc-5443-47b2-8936-85ac055ea57f-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "6e06cecc-5443-47b2-8936-85ac055ea57f" (UID: "6e06cecc-5443-47b2-8936-85ac055ea57f"). InnerVolumeSpecName "var-lib-manila". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.446449 4703 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e06cecc-5443-47b2-8936-85ac055ea57f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.446469 4703 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/6e06cecc-5443-47b2-8936-85ac055ea57f-var-lib-manila\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.452316 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-ceph" (OuterVolumeSpecName: "ceph") pod "6e06cecc-5443-47b2-8936-85ac055ea57f" (UID: "6e06cecc-5443-47b2-8936-85ac055ea57f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.452312 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e06cecc-5443-47b2-8936-85ac055ea57f-kube-api-access-bbvwd" (OuterVolumeSpecName: "kube-api-access-bbvwd") pod "6e06cecc-5443-47b2-8936-85ac055ea57f" (UID: "6e06cecc-5443-47b2-8936-85ac055ea57f"). InnerVolumeSpecName "kube-api-access-bbvwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.452647 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-scripts" (OuterVolumeSpecName: "scripts") pod "6e06cecc-5443-47b2-8936-85ac055ea57f" (UID: "6e06cecc-5443-47b2-8936-85ac055ea57f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.452825 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6e06cecc-5443-47b2-8936-85ac055ea57f" (UID: "6e06cecc-5443-47b2-8936-85ac055ea57f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.523228 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-config-data" (OuterVolumeSpecName: "config-data") pod "6e06cecc-5443-47b2-8936-85ac055ea57f" (UID: "6e06cecc-5443-47b2-8936-85ac055ea57f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.548123 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.548160 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.548174 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbvwd\" (UniqueName: \"kubernetes.io/projected/6e06cecc-5443-47b2-8936-85ac055ea57f-kube-api-access-bbvwd\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.548188 4703 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 
13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.548201 4703 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6e06cecc-5443-47b2-8936-85ac055ea57f-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.753977 4703 generic.go:334] "Generic (PLEG): container finished" podID="978210ad-7b9d-4cfb-9248-5ee38ebbe489" containerID="855c438b17fd6f48c89ac3f87355086e3078feeac1fe499557b9a2a22baa41e8" exitCode=0 Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.754064 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-0" event={"ID":"978210ad-7b9d-4cfb-9248-5ee38ebbe489","Type":"ContainerDied","Data":"855c438b17fd6f48c89ac3f87355086e3078feeac1fe499557b9a2a22baa41e8"} Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.756289 4703 generic.go:334] "Generic (PLEG): container finished" podID="a74a24f0-1a47-4ec0-9c06-4362768ae4ee" containerID="fa3b535b04783dd535a6a864655f68d34f1ae1b67c08ad10eb42d0b273b34e06" exitCode=0 Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.756311 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-2" event={"ID":"a74a24f0-1a47-4ec0-9c06-4362768ae4ee","Type":"ContainerDied","Data":"fa3b535b04783dd535a6a864655f68d34f1ae1b67c08ad10eb42d0b273b34e06"} Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.758732 4703 generic.go:334] "Generic (PLEG): container finished" podID="0ef9316d-eaf8-48a3-8172-cd81f3ebab38" containerID="323afa96929e1ca6685f5ccb5bc90b77dc1cec252dd817da61222b0790ee2f12" exitCode=0 Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.758773 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-1" event={"ID":"0ef9316d-eaf8-48a3-8172-cd81f3ebab38","Type":"ContainerDied","Data":"323afa96929e1ca6685f5ccb5bc90b77dc1cec252dd817da61222b0790ee2f12"} Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 
13:42:51.761294 4703 generic.go:334] "Generic (PLEG): container finished" podID="6e06cecc-5443-47b2-8936-85ac055ea57f" containerID="ace02df985a696b1c29ad5486e34527753a7e2c297ca6d4a7cca5aea49fce241" exitCode=0 Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.761317 4703 generic.go:334] "Generic (PLEG): container finished" podID="6e06cecc-5443-47b2-8936-85ac055ea57f" containerID="7e51e3658ca73b2f35252189c57552f0a414c7f222d8036853717e2364a98884" exitCode=1 Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.761372 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.761376 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share0-0" event={"ID":"6e06cecc-5443-47b2-8936-85ac055ea57f","Type":"ContainerDied","Data":"ace02df985a696b1c29ad5486e34527753a7e2c297ca6d4a7cca5aea49fce241"} Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.761482 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share0-0" event={"ID":"6e06cecc-5443-47b2-8936-85ac055ea57f","Type":"ContainerDied","Data":"7e51e3658ca73b2f35252189c57552f0a414c7f222d8036853717e2364a98884"} Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.761507 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share0-0" event={"ID":"6e06cecc-5443-47b2-8936-85ac055ea57f","Type":"ContainerDied","Data":"d4201a72e646c4a50d357536476f4658214412741d6e5f3b83b3a3321b175766"} Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.761521 4703 scope.go:117] "RemoveContainer" containerID="ace02df985a696b1c29ad5486e34527753a7e2c297ca6d4a7cca5aea49fce241" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.763631 4703 generic.go:334] "Generic (PLEG): container finished" podID="cd3fcb56-0a55-43e9-93d9-124fc2269d75" 
containerID="73c722e627e04c8631389202420e69e40ce36adf8ecc45a47499418b2d2b6639" exitCode=0 Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.763668 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila829d-account-delete-4m64p" event={"ID":"cd3fcb56-0a55-43e9-93d9-124fc2269d75","Type":"ContainerDied","Data":"73c722e627e04c8631389202420e69e40ce36adf8ecc45a47499418b2d2b6639"} Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.763694 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila829d-account-delete-4m64p" event={"ID":"cd3fcb56-0a55-43e9-93d9-124fc2269d75","Type":"ContainerStarted","Data":"ea3d58e8470f1333d8f30a4c07c267ee724e28e065a4188c97dde0ae2c47d422"} Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.782322 4703 scope.go:117] "RemoveContainer" containerID="7e51e3658ca73b2f35252189c57552f0a414c7f222d8036853717e2364a98884" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.809097 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-share-share0-0"] Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.810163 4703 scope.go:117] "RemoveContainer" containerID="ace02df985a696b1c29ad5486e34527753a7e2c297ca6d4a7cca5aea49fce241" Mar 09 13:42:51 crc kubenswrapper[4703]: E0309 13:42:51.810695 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ace02df985a696b1c29ad5486e34527753a7e2c297ca6d4a7cca5aea49fce241\": container with ID starting with ace02df985a696b1c29ad5486e34527753a7e2c297ca6d4a7cca5aea49fce241 not found: ID does not exist" containerID="ace02df985a696b1c29ad5486e34527753a7e2c297ca6d4a7cca5aea49fce241" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.810741 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace02df985a696b1c29ad5486e34527753a7e2c297ca6d4a7cca5aea49fce241"} err="failed to get container status 
\"ace02df985a696b1c29ad5486e34527753a7e2c297ca6d4a7cca5aea49fce241\": rpc error: code = NotFound desc = could not find container \"ace02df985a696b1c29ad5486e34527753a7e2c297ca6d4a7cca5aea49fce241\": container with ID starting with ace02df985a696b1c29ad5486e34527753a7e2c297ca6d4a7cca5aea49fce241 not found: ID does not exist" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.810770 4703 scope.go:117] "RemoveContainer" containerID="7e51e3658ca73b2f35252189c57552f0a414c7f222d8036853717e2364a98884" Mar 09 13:42:51 crc kubenswrapper[4703]: E0309 13:42:51.811229 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e51e3658ca73b2f35252189c57552f0a414c7f222d8036853717e2364a98884\": container with ID starting with 7e51e3658ca73b2f35252189c57552f0a414c7f222d8036853717e2364a98884 not found: ID does not exist" containerID="7e51e3658ca73b2f35252189c57552f0a414c7f222d8036853717e2364a98884" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.811265 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e51e3658ca73b2f35252189c57552f0a414c7f222d8036853717e2364a98884"} err="failed to get container status \"7e51e3658ca73b2f35252189c57552f0a414c7f222d8036853717e2364a98884\": rpc error: code = NotFound desc = could not find container \"7e51e3658ca73b2f35252189c57552f0a414c7f222d8036853717e2364a98884\": container with ID starting with 7e51e3658ca73b2f35252189c57552f0a414c7f222d8036853717e2364a98884 not found: ID does not exist" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.811293 4703 scope.go:117] "RemoveContainer" containerID="ace02df985a696b1c29ad5486e34527753a7e2c297ca6d4a7cca5aea49fce241" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.811562 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace02df985a696b1c29ad5486e34527753a7e2c297ca6d4a7cca5aea49fce241"} err="failed to get 
container status \"ace02df985a696b1c29ad5486e34527753a7e2c297ca6d4a7cca5aea49fce241\": rpc error: code = NotFound desc = could not find container \"ace02df985a696b1c29ad5486e34527753a7e2c297ca6d4a7cca5aea49fce241\": container with ID starting with ace02df985a696b1c29ad5486e34527753a7e2c297ca6d4a7cca5aea49fce241 not found: ID does not exist" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.811579 4703 scope.go:117] "RemoveContainer" containerID="7e51e3658ca73b2f35252189c57552f0a414c7f222d8036853717e2364a98884" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.811928 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e51e3658ca73b2f35252189c57552f0a414c7f222d8036853717e2364a98884"} err="failed to get container status \"7e51e3658ca73b2f35252189c57552f0a414c7f222d8036853717e2364a98884\": rpc error: code = NotFound desc = could not find container \"7e51e3658ca73b2f35252189c57552f0a414c7f222d8036853717e2364a98884\": container with ID starting with 7e51e3658ca73b2f35252189c57552f0a414c7f222d8036853717e2364a98884 not found: ID does not exist" Mar 09 13:42:51 crc kubenswrapper[4703]: I0309 13:42:51.822923 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-share-share0-0"] Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.375770 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-scheduler-1" Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.562043 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-etc-machine-id\") pod \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\" (UID: \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\") " Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.562082 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-config-data\") pod \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\" (UID: \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\") " Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.562109 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmvdk\" (UniqueName: \"kubernetes.io/projected/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-kube-api-access-mmvdk\") pod \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\" (UID: \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\") " Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.562132 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-scripts\") pod \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\" (UID: \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\") " Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.562155 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-config-data-custom\") pod \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\" (UID: \"0ef9316d-eaf8-48a3-8172-cd81f3ebab38\") " Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.562162 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0ef9316d-eaf8-48a3-8172-cd81f3ebab38" (UID: "0ef9316d-eaf8-48a3-8172-cd81f3ebab38"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.562373 4703 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.582038 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0ef9316d-eaf8-48a3-8172-cd81f3ebab38" (UID: "0ef9316d-eaf8-48a3-8172-cd81f3ebab38"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.582188 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-scripts" (OuterVolumeSpecName: "scripts") pod "0ef9316d-eaf8-48a3-8172-cd81f3ebab38" (UID: "0ef9316d-eaf8-48a3-8172-cd81f3ebab38"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.582220 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-kube-api-access-mmvdk" (OuterVolumeSpecName: "kube-api-access-mmvdk") pod "0ef9316d-eaf8-48a3-8172-cd81f3ebab38" (UID: "0ef9316d-eaf8-48a3-8172-cd81f3ebab38"). InnerVolumeSpecName "kube-api-access-mmvdk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.636182 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-config-data" (OuterVolumeSpecName: "config-data") pod "0ef9316d-eaf8-48a3-8172-cd81f3ebab38" (UID: "0ef9316d-eaf8-48a3-8172-cd81f3ebab38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.663715 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.663744 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmvdk\" (UniqueName: \"kubernetes.io/projected/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-kube-api-access-mmvdk\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.663755 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.663764 4703 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ef9316d-eaf8-48a3-8172-cd81f3ebab38-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.717166 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e06cecc-5443-47b2-8936-85ac055ea57f" path="/var/lib/kubelet/pods/6e06cecc-5443-47b2-8936-85ac055ea57f/volumes" Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.772077 4703 generic.go:334] "Generic (PLEG): container finished" podID="0ef9316d-eaf8-48a3-8172-cd81f3ebab38" 
containerID="841c3e9a206d8fada5718c806e870bf91e26ad5a86319a175cd85971edec9077" exitCode=0 Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.772166 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-1" event={"ID":"0ef9316d-eaf8-48a3-8172-cd81f3ebab38","Type":"ContainerDied","Data":"841c3e9a206d8fada5718c806e870bf91e26ad5a86319a175cd85971edec9077"} Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.772201 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-1" event={"ID":"0ef9316d-eaf8-48a3-8172-cd81f3ebab38","Type":"ContainerDied","Data":"191bed86377fb9cdedc5fca8adc9b01e876d213f7f3b3549489100d416c67395"} Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.772226 4703 scope.go:117] "RemoveContainer" containerID="323afa96929e1ca6685f5ccb5bc90b77dc1cec252dd817da61222b0790ee2f12" Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.772597 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-scheduler-1" Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.794502 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-scheduler-1"] Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.800897 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-scheduler-1"] Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.803264 4703 scope.go:117] "RemoveContainer" containerID="841c3e9a206d8fada5718c806e870bf91e26ad5a86319a175cd85971edec9077" Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.820510 4703 scope.go:117] "RemoveContainer" containerID="323afa96929e1ca6685f5ccb5bc90b77dc1cec252dd817da61222b0790ee2f12" Mar 09 13:42:52 crc kubenswrapper[4703]: E0309 13:42:52.820889 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"323afa96929e1ca6685f5ccb5bc90b77dc1cec252dd817da61222b0790ee2f12\": container with ID starting with 323afa96929e1ca6685f5ccb5bc90b77dc1cec252dd817da61222b0790ee2f12 not found: ID does not exist" containerID="323afa96929e1ca6685f5ccb5bc90b77dc1cec252dd817da61222b0790ee2f12" Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.820949 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"323afa96929e1ca6685f5ccb5bc90b77dc1cec252dd817da61222b0790ee2f12"} err="failed to get container status \"323afa96929e1ca6685f5ccb5bc90b77dc1cec252dd817da61222b0790ee2f12\": rpc error: code = NotFound desc = could not find container \"323afa96929e1ca6685f5ccb5bc90b77dc1cec252dd817da61222b0790ee2f12\": container with ID starting with 323afa96929e1ca6685f5ccb5bc90b77dc1cec252dd817da61222b0790ee2f12 not found: ID does not exist" Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.820975 4703 scope.go:117] "RemoveContainer" containerID="841c3e9a206d8fada5718c806e870bf91e26ad5a86319a175cd85971edec9077" Mar 09 
13:42:52 crc kubenswrapper[4703]: E0309 13:42:52.821224 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"841c3e9a206d8fada5718c806e870bf91e26ad5a86319a175cd85971edec9077\": container with ID starting with 841c3e9a206d8fada5718c806e870bf91e26ad5a86319a175cd85971edec9077 not found: ID does not exist" containerID="841c3e9a206d8fada5718c806e870bf91e26ad5a86319a175cd85971edec9077" Mar 09 13:42:52 crc kubenswrapper[4703]: I0309 13:42:52.821253 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"841c3e9a206d8fada5718c806e870bf91e26ad5a86319a175cd85971edec9077"} err="failed to get container status \"841c3e9a206d8fada5718c806e870bf91e26ad5a86319a175cd85971edec9077\": rpc error: code = NotFound desc = could not find container \"841c3e9a206d8fada5718c806e870bf91e26ad5a86319a175cd85971edec9077\": container with ID starting with 841c3e9a206d8fada5718c806e870bf91e26ad5a86319a175cd85971edec9077 not found: ID does not exist" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.001749 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila829d-account-delete-4m64p" Mar 09 13:42:53 crc kubenswrapper[4703]: E0309 13:42:53.127384 4703 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod978210ad_7b9d_4cfb_9248_5ee38ebbe489.slice/crio-694b9d7025a089844429e96fdf5d59644336935d272e121fb0e3e4588c4fb9b1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda74a24f0_1a47_4ec0_9c06_4362768ae4ee.slice/crio-6429b4cd8482b9ab17a6e0b9bb6e828e4fd274d54e5629dc3d4a6988364d3ef9.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.175325 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf79x\" (UniqueName: \"kubernetes.io/projected/cd3fcb56-0a55-43e9-93d9-124fc2269d75-kube-api-access-sf79x\") pod \"cd3fcb56-0a55-43e9-93d9-124fc2269d75\" (UID: \"cd3fcb56-0a55-43e9-93d9-124fc2269d75\") " Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.175420 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd3fcb56-0a55-43e9-93d9-124fc2269d75-operator-scripts\") pod \"cd3fcb56-0a55-43e9-93d9-124fc2269d75\" (UID: \"cd3fcb56-0a55-43e9-93d9-124fc2269d75\") " Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.176367 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd3fcb56-0a55-43e9-93d9-124fc2269d75-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cd3fcb56-0a55-43e9-93d9-124fc2269d75" (UID: "cd3fcb56-0a55-43e9-93d9-124fc2269d75"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.178307 4703 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd3fcb56-0a55-43e9-93d9-124fc2269d75-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.190036 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd3fcb56-0a55-43e9-93d9-124fc2269d75-kube-api-access-sf79x" (OuterVolumeSpecName: "kube-api-access-sf79x") pod "cd3fcb56-0a55-43e9-93d9-124fc2269d75" (UID: "cd3fcb56-0a55-43e9-93d9-124fc2269d75"). InnerVolumeSpecName "kube-api-access-sf79x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.279662 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf79x\" (UniqueName: \"kubernetes.io/projected/cd3fcb56-0a55-43e9-93d9-124fc2269d75-kube-api-access-sf79x\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.392121 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-scheduler-2" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.422793 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.581775 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-etc-machine-id\") pod \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\" (UID: \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\") " Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.581894 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85x9x\" (UniqueName: \"kubernetes.io/projected/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-kube-api-access-85x9x\") pod \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\" (UID: \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\") " Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.581929 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/978210ad-7b9d-4cfb-9248-5ee38ebbe489-scripts\") pod \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\" (UID: \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\") " Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.581956 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-config-data-custom\") pod \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\" (UID: \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\") " Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.582051 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-config-data\") pod \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\" (UID: \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\") " Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.582100 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/978210ad-7b9d-4cfb-9248-5ee38ebbe489-etc-machine-id\") pod \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\" (UID: \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\") " Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.582140 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/978210ad-7b9d-4cfb-9248-5ee38ebbe489-config-data\") pod \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\" (UID: \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\") " Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.582179 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mccnk\" (UniqueName: \"kubernetes.io/projected/978210ad-7b9d-4cfb-9248-5ee38ebbe489-kube-api-access-mccnk\") pod \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\" (UID: \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\") " Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.582203 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-scripts\") pod \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\" (UID: \"a74a24f0-1a47-4ec0-9c06-4362768ae4ee\") " Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.582242 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/978210ad-7b9d-4cfb-9248-5ee38ebbe489-config-data-custom\") pod \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\" (UID: \"978210ad-7b9d-4cfb-9248-5ee38ebbe489\") " Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.583583 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a74a24f0-1a47-4ec0-9c06-4362768ae4ee" (UID: "a74a24f0-1a47-4ec0-9c06-4362768ae4ee"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.583692 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/978210ad-7b9d-4cfb-9248-5ee38ebbe489-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "978210ad-7b9d-4cfb-9248-5ee38ebbe489" (UID: "978210ad-7b9d-4cfb-9248-5ee38ebbe489"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.587088 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-scripts" (OuterVolumeSpecName: "scripts") pod "a74a24f0-1a47-4ec0-9c06-4362768ae4ee" (UID: "a74a24f0-1a47-4ec0-9c06-4362768ae4ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.587644 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a74a24f0-1a47-4ec0-9c06-4362768ae4ee" (UID: "a74a24f0-1a47-4ec0-9c06-4362768ae4ee"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.588035 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-kube-api-access-85x9x" (OuterVolumeSpecName: "kube-api-access-85x9x") pod "a74a24f0-1a47-4ec0-9c06-4362768ae4ee" (UID: "a74a24f0-1a47-4ec0-9c06-4362768ae4ee"). InnerVolumeSpecName "kube-api-access-85x9x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.588117 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/978210ad-7b9d-4cfb-9248-5ee38ebbe489-kube-api-access-mccnk" (OuterVolumeSpecName: "kube-api-access-mccnk") pod "978210ad-7b9d-4cfb-9248-5ee38ebbe489" (UID: "978210ad-7b9d-4cfb-9248-5ee38ebbe489"). InnerVolumeSpecName "kube-api-access-mccnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.589393 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/978210ad-7b9d-4cfb-9248-5ee38ebbe489-scripts" (OuterVolumeSpecName: "scripts") pod "978210ad-7b9d-4cfb-9248-5ee38ebbe489" (UID: "978210ad-7b9d-4cfb-9248-5ee38ebbe489"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.594106 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/978210ad-7b9d-4cfb-9248-5ee38ebbe489-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "978210ad-7b9d-4cfb-9248-5ee38ebbe489" (UID: "978210ad-7b9d-4cfb-9248-5ee38ebbe489"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.683613 4703 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/978210ad-7b9d-4cfb-9248-5ee38ebbe489-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.683653 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mccnk\" (UniqueName: \"kubernetes.io/projected/978210ad-7b9d-4cfb-9248-5ee38ebbe489-kube-api-access-mccnk\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.683668 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.683680 4703 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/978210ad-7b9d-4cfb-9248-5ee38ebbe489-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.683691 4703 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.683702 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85x9x\" (UniqueName: \"kubernetes.io/projected/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-kube-api-access-85x9x\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.683713 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/978210ad-7b9d-4cfb-9248-5ee38ebbe489-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.683723 4703 
reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.702807 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/978210ad-7b9d-4cfb-9248-5ee38ebbe489-config-data" (OuterVolumeSpecName: "config-data") pod "978210ad-7b9d-4cfb-9248-5ee38ebbe489" (UID: "978210ad-7b9d-4cfb-9248-5ee38ebbe489"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.704532 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-config-data" (OuterVolumeSpecName: "config-data") pod "a74a24f0-1a47-4ec0-9c06-4362768ae4ee" (UID: "a74a24f0-1a47-4ec0-9c06-4362768ae4ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.720546 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="manila-kuttl-tests/manila-api-0" podUID="812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac" containerName="manila-api" probeResult="failure" output="Get \"http://10.217.0.94:8786/healthcheck\": dial tcp 10.217.0.94:8786: connect: connection refused" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.783177 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila829d-account-delete-4m64p" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.784115 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila829d-account-delete-4m64p" event={"ID":"cd3fcb56-0a55-43e9-93d9-124fc2269d75","Type":"ContainerDied","Data":"ea3d58e8470f1333d8f30a4c07c267ee724e28e065a4188c97dde0ae2c47d422"} Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.784153 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea3d58e8470f1333d8f30a4c07c267ee724e28e065a4188c97dde0ae2c47d422" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.784797 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a74a24f0-1a47-4ec0-9c06-4362768ae4ee-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.784821 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/978210ad-7b9d-4cfb-9248-5ee38ebbe489-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.788795 4703 generic.go:334] "Generic (PLEG): container finished" podID="812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac" containerID="1603c6b8c50178a5d0277808e76141a21e957f4317d8a8f443937691d07464b9" exitCode=0 Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.788891 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-0" event={"ID":"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac","Type":"ContainerDied","Data":"1603c6b8c50178a5d0277808e76141a21e957f4317d8a8f443937691d07464b9"} Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.791519 4703 generic.go:334] "Generic (PLEG): container finished" podID="978210ad-7b9d-4cfb-9248-5ee38ebbe489" containerID="694b9d7025a089844429e96fdf5d59644336935d272e121fb0e3e4588c4fb9b1" exitCode=0 Mar 09 13:42:53 crc kubenswrapper[4703]: 
I0309 13:42:53.791573 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.791610 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-0" event={"ID":"978210ad-7b9d-4cfb-9248-5ee38ebbe489","Type":"ContainerDied","Data":"694b9d7025a089844429e96fdf5d59644336935d272e121fb0e3e4588c4fb9b1"} Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.791706 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-0" event={"ID":"978210ad-7b9d-4cfb-9248-5ee38ebbe489","Type":"ContainerDied","Data":"9d23f88347bd3f6ebad00bbe4d2412135138f9b64040f8c5d163008f56eacc09"} Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.791735 4703 scope.go:117] "RemoveContainer" containerID="855c438b17fd6f48c89ac3f87355086e3078feeac1fe499557b9a2a22baa41e8" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.794000 4703 generic.go:334] "Generic (PLEG): container finished" podID="a74a24f0-1a47-4ec0-9c06-4362768ae4ee" containerID="6429b4cd8482b9ab17a6e0b9bb6e828e4fd274d54e5629dc3d4a6988364d3ef9" exitCode=0 Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.794055 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-scheduler-2" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.794070 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-2" event={"ID":"a74a24f0-1a47-4ec0-9c06-4362768ae4ee","Type":"ContainerDied","Data":"6429b4cd8482b9ab17a6e0b9bb6e828e4fd274d54e5629dc3d4a6988364d3ef9"} Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.794092 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-2" event={"ID":"a74a24f0-1a47-4ec0-9c06-4362768ae4ee","Type":"ContainerDied","Data":"35de41e593cc6e80921206b6ca6848a48d309b95e5440d76b101d7e2cf68d42a"} Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.831987 4703 scope.go:117] "RemoveContainer" containerID="694b9d7025a089844429e96fdf5d59644336935d272e121fb0e3e4588c4fb9b1" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.836245 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-scheduler-0"] Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.847985 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-scheduler-0"] Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.856475 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-scheduler-2"] Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.859035 4703 scope.go:117] "RemoveContainer" containerID="855c438b17fd6f48c89ac3f87355086e3078feeac1fe499557b9a2a22baa41e8" Mar 09 13:42:53 crc kubenswrapper[4703]: E0309 13:42:53.859526 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"855c438b17fd6f48c89ac3f87355086e3078feeac1fe499557b9a2a22baa41e8\": container with ID starting with 855c438b17fd6f48c89ac3f87355086e3078feeac1fe499557b9a2a22baa41e8 not found: ID does not exist" 
containerID="855c438b17fd6f48c89ac3f87355086e3078feeac1fe499557b9a2a22baa41e8" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.859571 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"855c438b17fd6f48c89ac3f87355086e3078feeac1fe499557b9a2a22baa41e8"} err="failed to get container status \"855c438b17fd6f48c89ac3f87355086e3078feeac1fe499557b9a2a22baa41e8\": rpc error: code = NotFound desc = could not find container \"855c438b17fd6f48c89ac3f87355086e3078feeac1fe499557b9a2a22baa41e8\": container with ID starting with 855c438b17fd6f48c89ac3f87355086e3078feeac1fe499557b9a2a22baa41e8 not found: ID does not exist" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.859602 4703 scope.go:117] "RemoveContainer" containerID="694b9d7025a089844429e96fdf5d59644336935d272e121fb0e3e4588c4fb9b1" Mar 09 13:42:53 crc kubenswrapper[4703]: E0309 13:42:53.860059 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"694b9d7025a089844429e96fdf5d59644336935d272e121fb0e3e4588c4fb9b1\": container with ID starting with 694b9d7025a089844429e96fdf5d59644336935d272e121fb0e3e4588c4fb9b1 not found: ID does not exist" containerID="694b9d7025a089844429e96fdf5d59644336935d272e121fb0e3e4588c4fb9b1" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.860084 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"694b9d7025a089844429e96fdf5d59644336935d272e121fb0e3e4588c4fb9b1"} err="failed to get container status \"694b9d7025a089844429e96fdf5d59644336935d272e121fb0e3e4588c4fb9b1\": rpc error: code = NotFound desc = could not find container \"694b9d7025a089844429e96fdf5d59644336935d272e121fb0e3e4588c4fb9b1\": container with ID starting with 694b9d7025a089844429e96fdf5d59644336935d272e121fb0e3e4588c4fb9b1 not found: ID does not exist" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.860102 4703 scope.go:117] 
"RemoveContainer" containerID="fa3b535b04783dd535a6a864655f68d34f1ae1b67c08ad10eb42d0b273b34e06" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.863280 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-scheduler-2"] Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.875593 4703 scope.go:117] "RemoveContainer" containerID="6429b4cd8482b9ab17a6e0b9bb6e828e4fd274d54e5629dc3d4a6988364d3ef9" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.892858 4703 scope.go:117] "RemoveContainer" containerID="fa3b535b04783dd535a6a864655f68d34f1ae1b67c08ad10eb42d0b273b34e06" Mar 09 13:42:53 crc kubenswrapper[4703]: E0309 13:42:53.893267 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa3b535b04783dd535a6a864655f68d34f1ae1b67c08ad10eb42d0b273b34e06\": container with ID starting with fa3b535b04783dd535a6a864655f68d34f1ae1b67c08ad10eb42d0b273b34e06 not found: ID does not exist" containerID="fa3b535b04783dd535a6a864655f68d34f1ae1b67c08ad10eb42d0b273b34e06" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.893323 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa3b535b04783dd535a6a864655f68d34f1ae1b67c08ad10eb42d0b273b34e06"} err="failed to get container status \"fa3b535b04783dd535a6a864655f68d34f1ae1b67c08ad10eb42d0b273b34e06\": rpc error: code = NotFound desc = could not find container \"fa3b535b04783dd535a6a864655f68d34f1ae1b67c08ad10eb42d0b273b34e06\": container with ID starting with fa3b535b04783dd535a6a864655f68d34f1ae1b67c08ad10eb42d0b273b34e06 not found: ID does not exist" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.893350 4703 scope.go:117] "RemoveContainer" containerID="6429b4cd8482b9ab17a6e0b9bb6e828e4fd274d54e5629dc3d4a6988364d3ef9" Mar 09 13:42:53 crc kubenswrapper[4703]: E0309 13:42:53.893930 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"6429b4cd8482b9ab17a6e0b9bb6e828e4fd274d54e5629dc3d4a6988364d3ef9\": container with ID starting with 6429b4cd8482b9ab17a6e0b9bb6e828e4fd274d54e5629dc3d4a6988364d3ef9 not found: ID does not exist" containerID="6429b4cd8482b9ab17a6e0b9bb6e828e4fd274d54e5629dc3d4a6988364d3ef9" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.893965 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6429b4cd8482b9ab17a6e0b9bb6e828e4fd274d54e5629dc3d4a6988364d3ef9"} err="failed to get container status \"6429b4cd8482b9ab17a6e0b9bb6e828e4fd274d54e5629dc3d4a6988364d3ef9\": rpc error: code = NotFound desc = could not find container \"6429b4cd8482b9ab17a6e0b9bb6e828e4fd274d54e5629dc3d4a6988364d3ef9\": container with ID starting with 6429b4cd8482b9ab17a6e0b9bb6e828e4fd274d54e5629dc3d4a6988364d3ef9 not found: ID does not exist" Mar 09 13:42:53 crc kubenswrapper[4703]: I0309 13:42:53.986884 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-api-0" Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.089622 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56bzp\" (UniqueName: \"kubernetes.io/projected/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-kube-api-access-56bzp\") pod \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.090048 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-config-data\") pod \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.090254 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-etc-machine-id\") pod \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.090295 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-config-data-custom\") pod \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.090328 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-logs\") pod \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.090363 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-scripts\") pod \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\" (UID: \"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac\") " Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.090533 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac" (UID: "812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.090713 4703 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.091032 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-logs" (OuterVolumeSpecName: "logs") pod "812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac" (UID: "812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.094967 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-scripts" (OuterVolumeSpecName: "scripts") pod "812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac" (UID: "812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.095008 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac" (UID: "812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.095155 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-kube-api-access-56bzp" (OuterVolumeSpecName: "kube-api-access-56bzp") pod "812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac" (UID: "812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac"). InnerVolumeSpecName "kube-api-access-56bzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.126776 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-config-data" (OuterVolumeSpecName: "config-data") pod "812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac" (UID: "812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.192139 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.192197 4703 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.192214 4703 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.192225 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.192240 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56bzp\" (UniqueName: \"kubernetes.io/projected/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac-kube-api-access-56bzp\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.717829 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ef9316d-eaf8-48a3-8172-cd81f3ebab38" path="/var/lib/kubelet/pods/0ef9316d-eaf8-48a3-8172-cd81f3ebab38/volumes" Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.719405 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="978210ad-7b9d-4cfb-9248-5ee38ebbe489" path="/var/lib/kubelet/pods/978210ad-7b9d-4cfb-9248-5ee38ebbe489/volumes" Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.720590 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a74a24f0-1a47-4ec0-9c06-4362768ae4ee" path="/var/lib/kubelet/pods/a74a24f0-1a47-4ec0-9c06-4362768ae4ee/volumes" Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.813177 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-0" event={"ID":"812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac","Type":"ContainerDied","Data":"8087fe00657dce92b454eaf335a0eeea37cf802143be5c43fc793e1a50601b10"} Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.813223 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-api-0" Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.813290 4703 scope.go:117] "RemoveContainer" containerID="1603c6b8c50178a5d0277808e76141a21e957f4317d8a8f443937691d07464b9" Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.840606 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-api-0"] Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.842259 4703 scope.go:117] "RemoveContainer" containerID="8b8af184830a6b6048951b0b80db23b040fa32b65bbb3e0110fed7ce7ccc7cdf" Mar 09 13:42:54 crc kubenswrapper[4703]: I0309 13:42:54.851363 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-api-0"] Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.417594 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-db-create-d9tx7"] Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.424681 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-db-create-d9tx7"] Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.449209 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-829d-account-create-update-zftrv"] Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.455005 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila829d-account-delete-4m64p"] Mar 09 13:42:55 crc 
kubenswrapper[4703]: I0309 13:42:55.459398 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-829d-account-create-update-zftrv"] Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.468166 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila829d-account-delete-4m64p"] Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.600434 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-db-create-f4lks"] Mar 09 13:42:55 crc kubenswrapper[4703]: E0309 13:42:55.600792 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac" containerName="manila-api" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.600823 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac" containerName="manila-api" Mar 09 13:42:55 crc kubenswrapper[4703]: E0309 13:42:55.600840 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ef9316d-eaf8-48a3-8172-cd81f3ebab38" containerName="probe" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.600876 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ef9316d-eaf8-48a3-8172-cd81f3ebab38" containerName="probe" Mar 09 13:42:55 crc kubenswrapper[4703]: E0309 13:42:55.600891 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="978210ad-7b9d-4cfb-9248-5ee38ebbe489" containerName="manila-scheduler" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.600901 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="978210ad-7b9d-4cfb-9248-5ee38ebbe489" containerName="manila-scheduler" Mar 09 13:42:55 crc kubenswrapper[4703]: E0309 13:42:55.600916 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a74a24f0-1a47-4ec0-9c06-4362768ae4ee" containerName="manila-scheduler" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.600928 4703 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a74a24f0-1a47-4ec0-9c06-4362768ae4ee" containerName="manila-scheduler" Mar 09 13:42:55 crc kubenswrapper[4703]: E0309 13:42:55.600947 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ef9316d-eaf8-48a3-8172-cd81f3ebab38" containerName="manila-scheduler" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.600957 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ef9316d-eaf8-48a3-8172-cd81f3ebab38" containerName="manila-scheduler" Mar 09 13:42:55 crc kubenswrapper[4703]: E0309 13:42:55.600972 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac" containerName="manila-api-log" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.600983 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac" containerName="manila-api-log" Mar 09 13:42:55 crc kubenswrapper[4703]: E0309 13:42:55.600999 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e06cecc-5443-47b2-8936-85ac055ea57f" containerName="manila-share" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.601010 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e06cecc-5443-47b2-8936-85ac055ea57f" containerName="manila-share" Mar 09 13:42:55 crc kubenswrapper[4703]: E0309 13:42:55.601026 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e06cecc-5443-47b2-8936-85ac055ea57f" containerName="probe" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.601036 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e06cecc-5443-47b2-8936-85ac055ea57f" containerName="probe" Mar 09 13:42:55 crc kubenswrapper[4703]: E0309 13:42:55.601049 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="978210ad-7b9d-4cfb-9248-5ee38ebbe489" containerName="probe" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.601059 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="978210ad-7b9d-4cfb-9248-5ee38ebbe489" 
containerName="probe" Mar 09 13:42:55 crc kubenswrapper[4703]: E0309 13:42:55.601080 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a74a24f0-1a47-4ec0-9c06-4362768ae4ee" containerName="probe" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.601089 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="a74a24f0-1a47-4ec0-9c06-4362768ae4ee" containerName="probe" Mar 09 13:42:55 crc kubenswrapper[4703]: E0309 13:42:55.601104 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd3fcb56-0a55-43e9-93d9-124fc2269d75" containerName="mariadb-account-delete" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.601114 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd3fcb56-0a55-43e9-93d9-124fc2269d75" containerName="mariadb-account-delete" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.601292 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e06cecc-5443-47b2-8936-85ac055ea57f" containerName="probe" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.601313 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac" containerName="manila-api" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.601331 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ef9316d-eaf8-48a3-8172-cd81f3ebab38" containerName="probe" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.601343 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="a74a24f0-1a47-4ec0-9c06-4362768ae4ee" containerName="probe" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.601357 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="978210ad-7b9d-4cfb-9248-5ee38ebbe489" containerName="manila-scheduler" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.601373 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd3fcb56-0a55-43e9-93d9-124fc2269d75" 
containerName="mariadb-account-delete" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.601386 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ef9316d-eaf8-48a3-8172-cd81f3ebab38" containerName="manila-scheduler" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.601402 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac" containerName="manila-api-log" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.601419 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="978210ad-7b9d-4cfb-9248-5ee38ebbe489" containerName="probe" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.601430 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="a74a24f0-1a47-4ec0-9c06-4362768ae4ee" containerName="manila-scheduler" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.601445 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e06cecc-5443-47b2-8936-85ac055ea57f" containerName="manila-share" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.602082 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-db-create-f4lks" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.607483 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-db-create-f4lks"] Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.628496 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-d652-account-create-update-f4rqg"] Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.630357 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-d652-account-create-update-f4rqg" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.632838 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-db-secret" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.642121 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-d652-account-create-update-f4rqg"] Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.717546 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbtkh\" (UniqueName: \"kubernetes.io/projected/6412a67b-65cf-4e02-801b-10cc7473496a-kube-api-access-fbtkh\") pod \"manila-db-create-f4lks\" (UID: \"6412a67b-65cf-4e02-801b-10cc7473496a\") " pod="manila-kuttl-tests/manila-db-create-f4lks" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.718191 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6412a67b-65cf-4e02-801b-10cc7473496a-operator-scripts\") pod \"manila-db-create-f4lks\" (UID: \"6412a67b-65cf-4e02-801b-10cc7473496a\") " pod="manila-kuttl-tests/manila-db-create-f4lks" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.820655 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mldxw\" (UniqueName: \"kubernetes.io/projected/85cb844c-e76f-4c27-a213-30185eeffd4f-kube-api-access-mldxw\") pod \"manila-d652-account-create-update-f4rqg\" (UID: \"85cb844c-e76f-4c27-a213-30185eeffd4f\") " pod="manila-kuttl-tests/manila-d652-account-create-update-f4rqg" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.820721 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbtkh\" (UniqueName: \"kubernetes.io/projected/6412a67b-65cf-4e02-801b-10cc7473496a-kube-api-access-fbtkh\") pod 
\"manila-db-create-f4lks\" (UID: \"6412a67b-65cf-4e02-801b-10cc7473496a\") " pod="manila-kuttl-tests/manila-db-create-f4lks" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.820741 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85cb844c-e76f-4c27-a213-30185eeffd4f-operator-scripts\") pod \"manila-d652-account-create-update-f4rqg\" (UID: \"85cb844c-e76f-4c27-a213-30185eeffd4f\") " pod="manila-kuttl-tests/manila-d652-account-create-update-f4rqg" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.820763 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6412a67b-65cf-4e02-801b-10cc7473496a-operator-scripts\") pod \"manila-db-create-f4lks\" (UID: \"6412a67b-65cf-4e02-801b-10cc7473496a\") " pod="manila-kuttl-tests/manila-db-create-f4lks" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.821624 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6412a67b-65cf-4e02-801b-10cc7473496a-operator-scripts\") pod \"manila-db-create-f4lks\" (UID: \"6412a67b-65cf-4e02-801b-10cc7473496a\") " pod="manila-kuttl-tests/manila-db-create-f4lks" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.844593 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbtkh\" (UniqueName: \"kubernetes.io/projected/6412a67b-65cf-4e02-801b-10cc7473496a-kube-api-access-fbtkh\") pod \"manila-db-create-f4lks\" (UID: \"6412a67b-65cf-4e02-801b-10cc7473496a\") " pod="manila-kuttl-tests/manila-db-create-f4lks" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.922601 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mldxw\" (UniqueName: 
\"kubernetes.io/projected/85cb844c-e76f-4c27-a213-30185eeffd4f-kube-api-access-mldxw\") pod \"manila-d652-account-create-update-f4rqg\" (UID: \"85cb844c-e76f-4c27-a213-30185eeffd4f\") " pod="manila-kuttl-tests/manila-d652-account-create-update-f4rqg" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.922708 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85cb844c-e76f-4c27-a213-30185eeffd4f-operator-scripts\") pod \"manila-d652-account-create-update-f4rqg\" (UID: \"85cb844c-e76f-4c27-a213-30185eeffd4f\") " pod="manila-kuttl-tests/manila-d652-account-create-update-f4rqg" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.923969 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85cb844c-e76f-4c27-a213-30185eeffd4f-operator-scripts\") pod \"manila-d652-account-create-update-f4rqg\" (UID: \"85cb844c-e76f-4c27-a213-30185eeffd4f\") " pod="manila-kuttl-tests/manila-d652-account-create-update-f4rqg" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.930267 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-db-create-f4lks" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.945020 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mldxw\" (UniqueName: \"kubernetes.io/projected/85cb844c-e76f-4c27-a213-30185eeffd4f-kube-api-access-mldxw\") pod \"manila-d652-account-create-update-f4rqg\" (UID: \"85cb844c-e76f-4c27-a213-30185eeffd4f\") " pod="manila-kuttl-tests/manila-d652-account-create-update-f4rqg" Mar 09 13:42:55 crc kubenswrapper[4703]: I0309 13:42:55.951331 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-d652-account-create-update-f4rqg" Mar 09 13:42:56 crc kubenswrapper[4703]: I0309 13:42:56.215472 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-d652-account-create-update-f4rqg"] Mar 09 13:42:56 crc kubenswrapper[4703]: I0309 13:42:56.347692 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-db-create-f4lks"] Mar 09 13:42:56 crc kubenswrapper[4703]: I0309 13:42:56.717760 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23a35046-9fb7-49ec-86d9-1fdc270b2cb7" path="/var/lib/kubelet/pods/23a35046-9fb7-49ec-86d9-1fdc270b2cb7/volumes" Mar 09 13:42:56 crc kubenswrapper[4703]: I0309 13:42:56.718891 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ec56df1-32a2-428a-98a3-4e824aaca5be" path="/var/lib/kubelet/pods/7ec56df1-32a2-428a-98a3-4e824aaca5be/volumes" Mar 09 13:42:56 crc kubenswrapper[4703]: I0309 13:42:56.719681 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac" path="/var/lib/kubelet/pods/812c8f6e-a1ed-40b8-9aa9-7d6a637ecfac/volumes" Mar 09 13:42:56 crc kubenswrapper[4703]: I0309 13:42:56.721227 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd3fcb56-0a55-43e9-93d9-124fc2269d75" path="/var/lib/kubelet/pods/cd3fcb56-0a55-43e9-93d9-124fc2269d75/volumes" Mar 09 13:42:56 crc kubenswrapper[4703]: I0309 13:42:56.876484 4703 generic.go:334] "Generic (PLEG): container finished" podID="6412a67b-65cf-4e02-801b-10cc7473496a" containerID="126677a845daf3c303e8c68d80efda701d45d16d2ebb59c8fc6c289fcfa45845" exitCode=0 Mar 09 13:42:56 crc kubenswrapper[4703]: I0309 13:42:56.876558 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-create-f4lks" 
event={"ID":"6412a67b-65cf-4e02-801b-10cc7473496a","Type":"ContainerDied","Data":"126677a845daf3c303e8c68d80efda701d45d16d2ebb59c8fc6c289fcfa45845"} Mar 09 13:42:56 crc kubenswrapper[4703]: I0309 13:42:56.877208 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-create-f4lks" event={"ID":"6412a67b-65cf-4e02-801b-10cc7473496a","Type":"ContainerStarted","Data":"806bff8a526e393ede5f03f3f22609938977700f7fadedda9e5001970ba42adf"} Mar 09 13:42:56 crc kubenswrapper[4703]: I0309 13:42:56.879542 4703 generic.go:334] "Generic (PLEG): container finished" podID="85cb844c-e76f-4c27-a213-30185eeffd4f" containerID="b66fea10bd1ffe1d6b60c1efc601c72dc5945d496166a9d70907cfc93cddabb3" exitCode=0 Mar 09 13:42:56 crc kubenswrapper[4703]: I0309 13:42:56.879605 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-d652-account-create-update-f4rqg" event={"ID":"85cb844c-e76f-4c27-a213-30185eeffd4f","Type":"ContainerDied","Data":"b66fea10bd1ffe1d6b60c1efc601c72dc5945d496166a9d70907cfc93cddabb3"} Mar 09 13:42:56 crc kubenswrapper[4703]: I0309 13:42:56.879641 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-d652-account-create-update-f4rqg" event={"ID":"85cb844c-e76f-4c27-a213-30185eeffd4f","Type":"ContainerStarted","Data":"d9ac64638706ee71f7d6966f2dea08b7d45350789aa8d918476faa8dc1297ab9"} Mar 09 13:42:58 crc kubenswrapper[4703]: I0309 13:42:58.308069 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-db-create-f4lks" Mar 09 13:42:58 crc kubenswrapper[4703]: I0309 13:42:58.312574 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-d652-account-create-update-f4rqg" Mar 09 13:42:58 crc kubenswrapper[4703]: I0309 13:42:58.487507 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6412a67b-65cf-4e02-801b-10cc7473496a-operator-scripts\") pod \"6412a67b-65cf-4e02-801b-10cc7473496a\" (UID: \"6412a67b-65cf-4e02-801b-10cc7473496a\") " Mar 09 13:42:58 crc kubenswrapper[4703]: I0309 13:42:58.487565 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbtkh\" (UniqueName: \"kubernetes.io/projected/6412a67b-65cf-4e02-801b-10cc7473496a-kube-api-access-fbtkh\") pod \"6412a67b-65cf-4e02-801b-10cc7473496a\" (UID: \"6412a67b-65cf-4e02-801b-10cc7473496a\") " Mar 09 13:42:58 crc kubenswrapper[4703]: I0309 13:42:58.487627 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85cb844c-e76f-4c27-a213-30185eeffd4f-operator-scripts\") pod \"85cb844c-e76f-4c27-a213-30185eeffd4f\" (UID: \"85cb844c-e76f-4c27-a213-30185eeffd4f\") " Mar 09 13:42:58 crc kubenswrapper[4703]: I0309 13:42:58.487680 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mldxw\" (UniqueName: \"kubernetes.io/projected/85cb844c-e76f-4c27-a213-30185eeffd4f-kube-api-access-mldxw\") pod \"85cb844c-e76f-4c27-a213-30185eeffd4f\" (UID: \"85cb844c-e76f-4c27-a213-30185eeffd4f\") " Mar 09 13:42:58 crc kubenswrapper[4703]: I0309 13:42:58.488703 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85cb844c-e76f-4c27-a213-30185eeffd4f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "85cb844c-e76f-4c27-a213-30185eeffd4f" (UID: "85cb844c-e76f-4c27-a213-30185eeffd4f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:42:58 crc kubenswrapper[4703]: I0309 13:42:58.488816 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6412a67b-65cf-4e02-801b-10cc7473496a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6412a67b-65cf-4e02-801b-10cc7473496a" (UID: "6412a67b-65cf-4e02-801b-10cc7473496a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:42:58 crc kubenswrapper[4703]: I0309 13:42:58.496187 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6412a67b-65cf-4e02-801b-10cc7473496a-kube-api-access-fbtkh" (OuterVolumeSpecName: "kube-api-access-fbtkh") pod "6412a67b-65cf-4e02-801b-10cc7473496a" (UID: "6412a67b-65cf-4e02-801b-10cc7473496a"). InnerVolumeSpecName "kube-api-access-fbtkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:58 crc kubenswrapper[4703]: I0309 13:42:58.496391 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85cb844c-e76f-4c27-a213-30185eeffd4f-kube-api-access-mldxw" (OuterVolumeSpecName: "kube-api-access-mldxw") pod "85cb844c-e76f-4c27-a213-30185eeffd4f" (UID: "85cb844c-e76f-4c27-a213-30185eeffd4f"). InnerVolumeSpecName "kube-api-access-mldxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:42:58 crc kubenswrapper[4703]: I0309 13:42:58.589321 4703 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6412a67b-65cf-4e02-801b-10cc7473496a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:58 crc kubenswrapper[4703]: I0309 13:42:58.589376 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbtkh\" (UniqueName: \"kubernetes.io/projected/6412a67b-65cf-4e02-801b-10cc7473496a-kube-api-access-fbtkh\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:58 crc kubenswrapper[4703]: I0309 13:42:58.589396 4703 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85cb844c-e76f-4c27-a213-30185eeffd4f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:58 crc kubenswrapper[4703]: I0309 13:42:58.589414 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mldxw\" (UniqueName: \"kubernetes.io/projected/85cb844c-e76f-4c27-a213-30185eeffd4f-kube-api-access-mldxw\") on node \"crc\" DevicePath \"\"" Mar 09 13:42:58 crc kubenswrapper[4703]: I0309 13:42:58.898699 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-create-f4lks" event={"ID":"6412a67b-65cf-4e02-801b-10cc7473496a","Type":"ContainerDied","Data":"806bff8a526e393ede5f03f3f22609938977700f7fadedda9e5001970ba42adf"} Mar 09 13:42:58 crc kubenswrapper[4703]: I0309 13:42:58.899240 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="806bff8a526e393ede5f03f3f22609938977700f7fadedda9e5001970ba42adf" Mar 09 13:42:58 crc kubenswrapper[4703]: I0309 13:42:58.899165 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-db-create-f4lks" Mar 09 13:42:58 crc kubenswrapper[4703]: I0309 13:42:58.901386 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-d652-account-create-update-f4rqg" event={"ID":"85cb844c-e76f-4c27-a213-30185eeffd4f","Type":"ContainerDied","Data":"d9ac64638706ee71f7d6966f2dea08b7d45350789aa8d918476faa8dc1297ab9"} Mar 09 13:42:58 crc kubenswrapper[4703]: I0309 13:42:58.901445 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9ac64638706ee71f7d6966f2dea08b7d45350789aa8d918476faa8dc1297ab9" Mar 09 13:42:58 crc kubenswrapper[4703]: I0309 13:42:58.901768 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-d652-account-create-update-f4rqg" Mar 09 13:43:00 crc kubenswrapper[4703]: I0309 13:43:00.853828 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-db-sync-574fb"] Mar 09 13:43:00 crc kubenswrapper[4703]: E0309 13:43:00.855701 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6412a67b-65cf-4e02-801b-10cc7473496a" containerName="mariadb-database-create" Mar 09 13:43:00 crc kubenswrapper[4703]: I0309 13:43:00.855825 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="6412a67b-65cf-4e02-801b-10cc7473496a" containerName="mariadb-database-create" Mar 09 13:43:00 crc kubenswrapper[4703]: E0309 13:43:00.855982 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85cb844c-e76f-4c27-a213-30185eeffd4f" containerName="mariadb-account-create-update" Mar 09 13:43:00 crc kubenswrapper[4703]: I0309 13:43:00.856080 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="85cb844c-e76f-4c27-a213-30185eeffd4f" containerName="mariadb-account-create-update" Mar 09 13:43:00 crc kubenswrapper[4703]: I0309 13:43:00.856387 4703 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6412a67b-65cf-4e02-801b-10cc7473496a" containerName="mariadb-database-create" Mar 09 13:43:00 crc kubenswrapper[4703]: I0309 13:43:00.856528 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="85cb844c-e76f-4c27-a213-30185eeffd4f" containerName="mariadb-account-create-update" Mar 09 13:43:00 crc kubenswrapper[4703]: I0309 13:43:00.857420 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-db-sync-574fb" Mar 09 13:43:00 crc kubenswrapper[4703]: I0309 13:43:00.861256 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-db-sync-574fb"] Mar 09 13:43:00 crc kubenswrapper[4703]: I0309 13:43:00.872938 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-config-data" Mar 09 13:43:00 crc kubenswrapper[4703]: I0309 13:43:00.873041 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-manila-dockercfg-xfwpj" Mar 09 13:43:00 crc kubenswrapper[4703]: I0309 13:43:00.925695 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwdkd\" (UniqueName: \"kubernetes.io/projected/808476d5-bf47-48a4-8735-4e89f4cac1f8-kube-api-access-nwdkd\") pod \"manila-db-sync-574fb\" (UID: \"808476d5-bf47-48a4-8735-4e89f4cac1f8\") " pod="manila-kuttl-tests/manila-db-sync-574fb" Mar 09 13:43:00 crc kubenswrapper[4703]: I0309 13:43:00.925813 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/808476d5-bf47-48a4-8735-4e89f4cac1f8-job-config-data\") pod \"manila-db-sync-574fb\" (UID: \"808476d5-bf47-48a4-8735-4e89f4cac1f8\") " pod="manila-kuttl-tests/manila-db-sync-574fb" Mar 09 13:43:00 crc kubenswrapper[4703]: I0309 13:43:00.925877 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/808476d5-bf47-48a4-8735-4e89f4cac1f8-config-data\") pod \"manila-db-sync-574fb\" (UID: \"808476d5-bf47-48a4-8735-4e89f4cac1f8\") " pod="manila-kuttl-tests/manila-db-sync-574fb" Mar 09 13:43:01 crc kubenswrapper[4703]: I0309 13:43:01.026805 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwdkd\" (UniqueName: \"kubernetes.io/projected/808476d5-bf47-48a4-8735-4e89f4cac1f8-kube-api-access-nwdkd\") pod \"manila-db-sync-574fb\" (UID: \"808476d5-bf47-48a4-8735-4e89f4cac1f8\") " pod="manila-kuttl-tests/manila-db-sync-574fb" Mar 09 13:43:01 crc kubenswrapper[4703]: I0309 13:43:01.027226 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/808476d5-bf47-48a4-8735-4e89f4cac1f8-job-config-data\") pod \"manila-db-sync-574fb\" (UID: \"808476d5-bf47-48a4-8735-4e89f4cac1f8\") " pod="manila-kuttl-tests/manila-db-sync-574fb" Mar 09 13:43:01 crc kubenswrapper[4703]: I0309 13:43:01.027406 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/808476d5-bf47-48a4-8735-4e89f4cac1f8-config-data\") pod \"manila-db-sync-574fb\" (UID: \"808476d5-bf47-48a4-8735-4e89f4cac1f8\") " pod="manila-kuttl-tests/manila-db-sync-574fb" Mar 09 13:43:01 crc kubenswrapper[4703]: I0309 13:43:01.034072 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/808476d5-bf47-48a4-8735-4e89f4cac1f8-job-config-data\") pod \"manila-db-sync-574fb\" (UID: \"808476d5-bf47-48a4-8735-4e89f4cac1f8\") " pod="manila-kuttl-tests/manila-db-sync-574fb" Mar 09 13:43:01 crc kubenswrapper[4703]: I0309 13:43:01.037228 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/808476d5-bf47-48a4-8735-4e89f4cac1f8-config-data\") pod 
\"manila-db-sync-574fb\" (UID: \"808476d5-bf47-48a4-8735-4e89f4cac1f8\") " pod="manila-kuttl-tests/manila-db-sync-574fb" Mar 09 13:43:01 crc kubenswrapper[4703]: I0309 13:43:01.053876 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwdkd\" (UniqueName: \"kubernetes.io/projected/808476d5-bf47-48a4-8735-4e89f4cac1f8-kube-api-access-nwdkd\") pod \"manila-db-sync-574fb\" (UID: \"808476d5-bf47-48a4-8735-4e89f4cac1f8\") " pod="manila-kuttl-tests/manila-db-sync-574fb" Mar 09 13:43:01 crc kubenswrapper[4703]: I0309 13:43:01.185374 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-db-sync-574fb" Mar 09 13:43:02 crc kubenswrapper[4703]: I0309 13:43:01.615228 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-db-sync-574fb"] Mar 09 13:43:02 crc kubenswrapper[4703]: W0309 13:43:01.617458 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod808476d5_bf47_48a4_8735_4e89f4cac1f8.slice/crio-222178b2da9167e2471b4f24b5d784e5cec2f13e77210963dcaab4e1726b4d6f WatchSource:0}: Error finding container 222178b2da9167e2471b4f24b5d784e5cec2f13e77210963dcaab4e1726b4d6f: Status 404 returned error can't find the container with id 222178b2da9167e2471b4f24b5d784e5cec2f13e77210963dcaab4e1726b4d6f Mar 09 13:43:02 crc kubenswrapper[4703]: I0309 13:43:01.925476 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-sync-574fb" event={"ID":"808476d5-bf47-48a4-8735-4e89f4cac1f8","Type":"ContainerStarted","Data":"222178b2da9167e2471b4f24b5d784e5cec2f13e77210963dcaab4e1726b4d6f"} Mar 09 13:43:02 crc kubenswrapper[4703]: I0309 13:43:02.938984 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-sync-574fb" 
event={"ID":"808476d5-bf47-48a4-8735-4e89f4cac1f8","Type":"ContainerStarted","Data":"ea8c3282c4ddfd90584cc8d00c829b2761551af5a7aacdd845aa84e97f3514ee"} Mar 09 13:43:02 crc kubenswrapper[4703]: I0309 13:43:02.977732 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-db-sync-574fb" podStartSLOduration=2.977681868 podStartE2EDuration="2.977681868s" podCreationTimestamp="2026-03-09 13:43:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:02.97162279 +0000 UTC m=+1378.939038476" watchObservedRunningTime="2026-03-09 13:43:02.977681868 +0000 UTC m=+1378.945097584" Mar 09 13:43:03 crc kubenswrapper[4703]: I0309 13:43:03.950764 4703 generic.go:334] "Generic (PLEG): container finished" podID="808476d5-bf47-48a4-8735-4e89f4cac1f8" containerID="ea8c3282c4ddfd90584cc8d00c829b2761551af5a7aacdd845aa84e97f3514ee" exitCode=0 Mar 09 13:43:03 crc kubenswrapper[4703]: I0309 13:43:03.950830 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-sync-574fb" event={"ID":"808476d5-bf47-48a4-8735-4e89f4cac1f8","Type":"ContainerDied","Data":"ea8c3282c4ddfd90584cc8d00c829b2761551af5a7aacdd845aa84e97f3514ee"} Mar 09 13:43:05 crc kubenswrapper[4703]: I0309 13:43:05.284758 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-db-sync-574fb" Mar 09 13:43:05 crc kubenswrapper[4703]: I0309 13:43:05.406162 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/808476d5-bf47-48a4-8735-4e89f4cac1f8-config-data\") pod \"808476d5-bf47-48a4-8735-4e89f4cac1f8\" (UID: \"808476d5-bf47-48a4-8735-4e89f4cac1f8\") " Mar 09 13:43:05 crc kubenswrapper[4703]: I0309 13:43:05.406415 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/808476d5-bf47-48a4-8735-4e89f4cac1f8-job-config-data\") pod \"808476d5-bf47-48a4-8735-4e89f4cac1f8\" (UID: \"808476d5-bf47-48a4-8735-4e89f4cac1f8\") " Mar 09 13:43:05 crc kubenswrapper[4703]: I0309 13:43:05.406595 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwdkd\" (UniqueName: \"kubernetes.io/projected/808476d5-bf47-48a4-8735-4e89f4cac1f8-kube-api-access-nwdkd\") pod \"808476d5-bf47-48a4-8735-4e89f4cac1f8\" (UID: \"808476d5-bf47-48a4-8735-4e89f4cac1f8\") " Mar 09 13:43:05 crc kubenswrapper[4703]: I0309 13:43:05.410871 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/808476d5-bf47-48a4-8735-4e89f4cac1f8-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "808476d5-bf47-48a4-8735-4e89f4cac1f8" (UID: "808476d5-bf47-48a4-8735-4e89f4cac1f8"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:05 crc kubenswrapper[4703]: I0309 13:43:05.411428 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/808476d5-bf47-48a4-8735-4e89f4cac1f8-kube-api-access-nwdkd" (OuterVolumeSpecName: "kube-api-access-nwdkd") pod "808476d5-bf47-48a4-8735-4e89f4cac1f8" (UID: "808476d5-bf47-48a4-8735-4e89f4cac1f8"). InnerVolumeSpecName "kube-api-access-nwdkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:05 crc kubenswrapper[4703]: I0309 13:43:05.414247 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/808476d5-bf47-48a4-8735-4e89f4cac1f8-config-data" (OuterVolumeSpecName: "config-data") pod "808476d5-bf47-48a4-8735-4e89f4cac1f8" (UID: "808476d5-bf47-48a4-8735-4e89f4cac1f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:05 crc kubenswrapper[4703]: I0309 13:43:05.508687 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/808476d5-bf47-48a4-8735-4e89f4cac1f8-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:05 crc kubenswrapper[4703]: I0309 13:43:05.508735 4703 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/808476d5-bf47-48a4-8735-4e89f4cac1f8-job-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:05 crc kubenswrapper[4703]: I0309 13:43:05.508756 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwdkd\" (UniqueName: \"kubernetes.io/projected/808476d5-bf47-48a4-8735-4e89f4cac1f8-kube-api-access-nwdkd\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:05 crc kubenswrapper[4703]: I0309 13:43:05.975452 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-sync-574fb" event={"ID":"808476d5-bf47-48a4-8735-4e89f4cac1f8","Type":"ContainerDied","Data":"222178b2da9167e2471b4f24b5d784e5cec2f13e77210963dcaab4e1726b4d6f"} Mar 09 13:43:05 crc kubenswrapper[4703]: I0309 13:43:05.975513 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="222178b2da9167e2471b4f24b5d784e5cec2f13e77210963dcaab4e1726b4d6f" Mar 09 13:43:05 crc kubenswrapper[4703]: I0309 13:43:05.975531 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-db-sync-574fb" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.241116 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-scheduler-0"] Mar 09 13:43:06 crc kubenswrapper[4703]: E0309 13:43:06.241442 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="808476d5-bf47-48a4-8735-4e89f4cac1f8" containerName="manila-db-sync" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.241464 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="808476d5-bf47-48a4-8735-4e89f4cac1f8" containerName="manila-db-sync" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.241621 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="808476d5-bf47-48a4-8735-4e89f4cac1f8" containerName="manila-db-sync" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.242381 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.244657 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-scripts" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.249605 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-scheduler-config-data" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.249932 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-config-data" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.250300 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-manila-dockercfg-xfwpj" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.257797 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-scheduler-0"] Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.323870 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ae7bdd12-c835-4951-ac39-092d7ef30b56-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"ae7bdd12-c835-4951-ac39-092d7ef30b56\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.324200 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lctp\" (UniqueName: \"kubernetes.io/projected/ae7bdd12-c835-4951-ac39-092d7ef30b56-kube-api-access-4lctp\") pod \"manila-scheduler-0\" (UID: \"ae7bdd12-c835-4951-ac39-092d7ef30b56\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.324246 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae7bdd12-c835-4951-ac39-092d7ef30b56-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"ae7bdd12-c835-4951-ac39-092d7ef30b56\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.324285 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7bdd12-c835-4951-ac39-092d7ef30b56-config-data\") pod \"manila-scheduler-0\" (UID: \"ae7bdd12-c835-4951-ac39-092d7ef30b56\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.324330 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae7bdd12-c835-4951-ac39-092d7ef30b56-scripts\") pod \"manila-scheduler-0\" (UID: \"ae7bdd12-c835-4951-ac39-092d7ef30b56\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.399089 4703 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["manila-kuttl-tests/manila-share-share0-0"] Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.400317 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.403512 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"ceph-conf-files" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.403824 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-share-share0-config-data" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.410198 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-share-share0-0"] Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.425731 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ae7bdd12-c835-4951-ac39-092d7ef30b56-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"ae7bdd12-c835-4951-ac39-092d7ef30b56\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.425954 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lctp\" (UniqueName: \"kubernetes.io/projected/ae7bdd12-c835-4951-ac39-092d7ef30b56-kube-api-access-4lctp\") pod \"manila-scheduler-0\" (UID: \"ae7bdd12-c835-4951-ac39-092d7ef30b56\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.426027 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae7bdd12-c835-4951-ac39-092d7ef30b56-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"ae7bdd12-c835-4951-ac39-092d7ef30b56\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.426119 4703 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-config-data\") pod \"manila-share-share0-0\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.426180 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-config-data-custom\") pod \"manila-share-share0-0\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.426246 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/e403e67c-1041-4639-9ec7-31609de1be4d-var-lib-manila\") pod \"manila-share-share0-0\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.426333 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7bdd12-c835-4951-ac39-092d7ef30b56-config-data\") pod \"manila-scheduler-0\" (UID: \"ae7bdd12-c835-4951-ac39-092d7ef30b56\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.426404 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df48b\" (UniqueName: \"kubernetes.io/projected/e403e67c-1041-4639-9ec7-31609de1be4d-kube-api-access-df48b\") pod \"manila-share-share0-0\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.426469 4703 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-scripts\") pod \"manila-share-share0-0\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.426577 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae7bdd12-c835-4951-ac39-092d7ef30b56-scripts\") pod \"manila-scheduler-0\" (UID: \"ae7bdd12-c835-4951-ac39-092d7ef30b56\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.426660 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e403e67c-1041-4639-9ec7-31609de1be4d-etc-machine-id\") pod \"manila-share-share0-0\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.426725 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-ceph\") pod \"manila-share-share0-0\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.426881 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ae7bdd12-c835-4951-ac39-092d7ef30b56-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"ae7bdd12-c835-4951-ac39-092d7ef30b56\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.449197 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ae7bdd12-c835-4951-ac39-092d7ef30b56-config-data\") pod \"manila-scheduler-0\" (UID: \"ae7bdd12-c835-4951-ac39-092d7ef30b56\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.451707 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae7bdd12-c835-4951-ac39-092d7ef30b56-scripts\") pod \"manila-scheduler-0\" (UID: \"ae7bdd12-c835-4951-ac39-092d7ef30b56\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.452637 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae7bdd12-c835-4951-ac39-092d7ef30b56-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"ae7bdd12-c835-4951-ac39-092d7ef30b56\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.458963 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lctp\" (UniqueName: \"kubernetes.io/projected/ae7bdd12-c835-4951-ac39-092d7ef30b56-kube-api-access-4lctp\") pod \"manila-scheduler-0\" (UID: \"ae7bdd12-c835-4951-ac39-092d7ef30b56\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.483235 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-api-0"] Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.484685 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.489237 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-api-config-data" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.495283 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-api-0"] Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.528780 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-ceph\") pod \"manila-share-share0-0\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.528915 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a131c573-116e-49d5-925e-64f99d666b43-config-data-custom\") pod \"manila-api-0\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.528962 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a131c573-116e-49d5-925e-64f99d666b43-etc-machine-id\") pod \"manila-api-0\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.528993 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-config-data\") pod \"manila-share-share0-0\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.529020 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-config-data-custom\") pod \"manila-share-share0-0\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.529054 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/e403e67c-1041-4639-9ec7-31609de1be4d-var-lib-manila\") pod \"manila-share-share0-0\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.529078 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a131c573-116e-49d5-925e-64f99d666b43-logs\") pod \"manila-api-0\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.529115 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df48b\" (UniqueName: \"kubernetes.io/projected/e403e67c-1041-4639-9ec7-31609de1be4d-kube-api-access-df48b\") pod \"manila-share-share0-0\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.529140 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a131c573-116e-49d5-925e-64f99d666b43-scripts\") pod \"manila-api-0\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.529160 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-scripts\") pod \"manila-share-share0-0\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.529191 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt56f\" (UniqueName: \"kubernetes.io/projected/a131c573-116e-49d5-925e-64f99d666b43-kube-api-access-vt56f\") pod \"manila-api-0\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.529198 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/e403e67c-1041-4639-9ec7-31609de1be4d-var-lib-manila\") pod \"manila-share-share0-0\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.529214 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a131c573-116e-49d5-925e-64f99d666b43-config-data\") pod \"manila-api-0\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.529342 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e403e67c-1041-4639-9ec7-31609de1be4d-etc-machine-id\") pod \"manila-share-share0-0\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.529451 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e403e67c-1041-4639-9ec7-31609de1be4d-etc-machine-id\") pod 
\"manila-share-share0-0\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.532779 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-scripts\") pod \"manila-share-share0-0\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.533581 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-config-data\") pod \"manila-share-share0-0\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.534053 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-ceph\") pod \"manila-share-share0-0\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.534147 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-config-data-custom\") pod \"manila-share-share0-0\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.543586 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df48b\" (UniqueName: \"kubernetes.io/projected/e403e67c-1041-4639-9ec7-31609de1be4d-kube-api-access-df48b\") pod \"manila-share-share0-0\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc 
kubenswrapper[4703]: I0309 13:43:06.617623 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.632763 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a131c573-116e-49d5-925e-64f99d666b43-config-data-custom\") pod \"manila-api-0\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.632820 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a131c573-116e-49d5-925e-64f99d666b43-etc-machine-id\") pod \"manila-api-0\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.632868 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a131c573-116e-49d5-925e-64f99d666b43-logs\") pod \"manila-api-0\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.632896 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a131c573-116e-49d5-925e-64f99d666b43-scripts\") pod \"manila-api-0\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.632923 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt56f\" (UniqueName: \"kubernetes.io/projected/a131c573-116e-49d5-925e-64f99d666b43-kube-api-access-vt56f\") pod \"manila-api-0\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:06 crc 
kubenswrapper[4703]: I0309 13:43:06.632941 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a131c573-116e-49d5-925e-64f99d666b43-config-data\") pod \"manila-api-0\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.634640 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a131c573-116e-49d5-925e-64f99d666b43-logs\") pod \"manila-api-0\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.634827 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a131c573-116e-49d5-925e-64f99d666b43-etc-machine-id\") pod \"manila-api-0\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.636468 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a131c573-116e-49d5-925e-64f99d666b43-config-data\") pod \"manila-api-0\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.637525 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a131c573-116e-49d5-925e-64f99d666b43-scripts\") pod \"manila-api-0\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.638202 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a131c573-116e-49d5-925e-64f99d666b43-config-data-custom\") pod \"manila-api-0\" 
(UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.650485 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt56f\" (UniqueName: \"kubernetes.io/projected/a131c573-116e-49d5-925e-64f99d666b43-kube-api-access-vt56f\") pod \"manila-api-0\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.716216 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.800333 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.942645 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-share-share0-0"] Mar 09 13:43:06 crc kubenswrapper[4703]: W0309 13:43:06.948740 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode403e67c_1041_4639_9ec7_31609de1be4d.slice/crio-27a99a0d10e7428c963ff0bd1567b47d1a4abd5e360957abd546bcafff9da25e WatchSource:0}: Error finding container 27a99a0d10e7428c963ff0bd1567b47d1a4abd5e360957abd546bcafff9da25e: Status 404 returned error can't find the container with id 27a99a0d10e7428c963ff0bd1567b47d1a4abd5e360957abd546bcafff9da25e Mar 09 13:43:06 crc kubenswrapper[4703]: I0309 13:43:06.990298 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share0-0" event={"ID":"e403e67c-1041-4639-9ec7-31609de1be4d","Type":"ContainerStarted","Data":"27a99a0d10e7428c963ff0bd1567b47d1a4abd5e360957abd546bcafff9da25e"} Mar 09 13:43:07 crc kubenswrapper[4703]: I0309 13:43:07.057545 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["manila-kuttl-tests/manila-scheduler-0"] Mar 09 13:43:07 crc kubenswrapper[4703]: I0309 13:43:07.230347 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-api-0"] Mar 09 13:43:07 crc kubenswrapper[4703]: W0309 13:43:07.234143 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda131c573_116e_49d5_925e_64f99d666b43.slice/crio-8c5ba1feab9f707b93b4039d4cce57d60156dd5ebd0e2fe2476ec347de2b0be4 WatchSource:0}: Error finding container 8c5ba1feab9f707b93b4039d4cce57d60156dd5ebd0e2fe2476ec347de2b0be4: Status 404 returned error can't find the container with id 8c5ba1feab9f707b93b4039d4cce57d60156dd5ebd0e2fe2476ec347de2b0be4 Mar 09 13:43:08 crc kubenswrapper[4703]: I0309 13:43:08.000885 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share0-0" event={"ID":"e403e67c-1041-4639-9ec7-31609de1be4d","Type":"ContainerStarted","Data":"2ca09bca95e7bc49adb1ef70fbaccc0c0f394c16c7ed2d36a9e7ba820378bf90"} Mar 09 13:43:08 crc kubenswrapper[4703]: I0309 13:43:08.001539 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share0-0" event={"ID":"e403e67c-1041-4639-9ec7-31609de1be4d","Type":"ContainerStarted","Data":"65fe9d44f00b0a37b7f2266eb5eb6dec268cf4c7555e1d649ced7070078a8e5a"} Mar 09 13:43:08 crc kubenswrapper[4703]: I0309 13:43:08.004521 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-0" event={"ID":"a131c573-116e-49d5-925e-64f99d666b43","Type":"ContainerStarted","Data":"3a2d18d807beedea659a4575bbdd1a0c0696ffe2da749f5e5a97fbaac853ba31"} Mar 09 13:43:08 crc kubenswrapper[4703]: I0309 13:43:08.004559 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-0" event={"ID":"a131c573-116e-49d5-925e-64f99d666b43","Type":"ContainerStarted","Data":"8c5ba1feab9f707b93b4039d4cce57d60156dd5ebd0e2fe2476ec347de2b0be4"} Mar 
09 13:43:08 crc kubenswrapper[4703]: I0309 13:43:08.006593 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-0" event={"ID":"ae7bdd12-c835-4951-ac39-092d7ef30b56","Type":"ContainerStarted","Data":"2b97044bab0635c83fd0d29d0338fe67f30710236bc9958109f5acc645413141"} Mar 09 13:43:08 crc kubenswrapper[4703]: I0309 13:43:08.006613 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-0" event={"ID":"ae7bdd12-c835-4951-ac39-092d7ef30b56","Type":"ContainerStarted","Data":"a5f95260999482ec510f9b65e55aea2aeefea92edf023395b79d7cc84cc2461e"} Mar 09 13:43:08 crc kubenswrapper[4703]: I0309 13:43:08.006622 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-0" event={"ID":"ae7bdd12-c835-4951-ac39-092d7ef30b56","Type":"ContainerStarted","Data":"377eace6ad10d6363c43a1d21e8a0f2da659617fc74653d9e7f5d81e374f784c"} Mar 09 13:43:08 crc kubenswrapper[4703]: I0309 13:43:08.018597 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-share-share0-0" podStartSLOduration=2.018576667 podStartE2EDuration="2.018576667s" podCreationTimestamp="2026-03-09 13:43:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:08.016396987 +0000 UTC m=+1383.983812673" watchObservedRunningTime="2026-03-09 13:43:08.018576667 +0000 UTC m=+1383.985992353" Mar 09 13:43:08 crc kubenswrapper[4703]: I0309 13:43:08.040051 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-scheduler-0" podStartSLOduration=2.040026454 podStartE2EDuration="2.040026454s" podCreationTimestamp="2026-03-09 13:43:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:08.033111612 +0000 UTC m=+1384.000527298" 
watchObservedRunningTime="2026-03-09 13:43:08.040026454 +0000 UTC m=+1384.007442140" Mar 09 13:43:09 crc kubenswrapper[4703]: I0309 13:43:09.015911 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-0" event={"ID":"a131c573-116e-49d5-925e-64f99d666b43","Type":"ContainerStarted","Data":"d0e73ee0d5db96d7248a8cb3098094c3a6a1ee812079fd12a8ff3fde3fc0d6de"} Mar 09 13:43:09 crc kubenswrapper[4703]: I0309 13:43:09.035215 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-api-0" podStartSLOduration=3.035194318 podStartE2EDuration="3.035194318s" podCreationTimestamp="2026-03-09 13:43:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:09.034370365 +0000 UTC m=+1385.001786101" watchObservedRunningTime="2026-03-09 13:43:09.035194318 +0000 UTC m=+1385.002610014" Mar 09 13:43:09 crc kubenswrapper[4703]: I0309 13:43:09.500322 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:43:09 crc kubenswrapper[4703]: I0309 13:43:09.500412 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:43:09 crc kubenswrapper[4703]: I0309 13:43:09.500461 4703 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" Mar 09 13:43:09 crc kubenswrapper[4703]: I0309 13:43:09.501084 4703 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"63593ee12ca38b373f82a991b81f46458f13b7e7d0525d9d0305c5a08fc34572"} pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:43:09 crc kubenswrapper[4703]: I0309 13:43:09.501143 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" containerID="cri-o://63593ee12ca38b373f82a991b81f46458f13b7e7d0525d9d0305c5a08fc34572" gracePeriod=600 Mar 09 13:43:10 crc kubenswrapper[4703]: I0309 13:43:10.024326 4703 generic.go:334] "Generic (PLEG): container finished" podID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerID="63593ee12ca38b373f82a991b81f46458f13b7e7d0525d9d0305c5a08fc34572" exitCode=0 Mar 09 13:43:10 crc kubenswrapper[4703]: I0309 13:43:10.024520 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" event={"ID":"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29","Type":"ContainerDied","Data":"63593ee12ca38b373f82a991b81f46458f13b7e7d0525d9d0305c5a08fc34572"} Mar 09 13:43:10 crc kubenswrapper[4703]: I0309 13:43:10.024665 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" event={"ID":"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29","Type":"ContainerStarted","Data":"4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f"} Mar 09 13:43:10 crc kubenswrapper[4703]: I0309 13:43:10.024687 4703 scope.go:117] "RemoveContainer" containerID="10cd3ad358458f317f93eedfe061a2094205d9a36cfaa6732883cb518fdce4bc" Mar 09 13:43:10 crc kubenswrapper[4703]: I0309 13:43:10.025598 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:16 crc kubenswrapper[4703]: I0309 13:43:16.618647 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:43:16 crc kubenswrapper[4703]: I0309 13:43:16.718477 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:28 crc kubenswrapper[4703]: I0309 13:43:28.078309 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:28 crc kubenswrapper[4703]: I0309 13:43:28.087294 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:43:28 crc kubenswrapper[4703]: I0309 13:43:28.196355 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.428384 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-share-share1-0"] Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.430366 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.432505 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-share-share1-config-data" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.453003 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-share-share1-0"] Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.482597 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.482703 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-scripts\") pod \"manila-share-share1-0\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.482756 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6f8t\" (UniqueName: \"kubernetes.io/projected/5cb3c834-6c40-44eb-aef4-7df38d22afc8-kube-api-access-g6f8t\") pod \"manila-share-share1-0\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.482799 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-ceph\") pod \"manila-share-share1-0\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " 
pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.482831 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5cb3c834-6c40-44eb-aef4-7df38d22afc8-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.482920 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/5cb3c834-6c40-44eb-aef4-7df38d22afc8-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.482963 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-config-data\") pod \"manila-share-share1-0\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.584581 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.584660 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-scripts\") pod \"manila-share-share1-0\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " 
pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.584721 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6f8t\" (UniqueName: \"kubernetes.io/projected/5cb3c834-6c40-44eb-aef4-7df38d22afc8-kube-api-access-g6f8t\") pod \"manila-share-share1-0\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.585704 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-ceph\") pod \"manila-share-share1-0\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.585744 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5cb3c834-6c40-44eb-aef4-7df38d22afc8-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.585789 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/5cb3c834-6c40-44eb-aef4-7df38d22afc8-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.585819 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-config-data\") pod \"manila-share-share1-0\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 
13:43:29.585874 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5cb3c834-6c40-44eb-aef4-7df38d22afc8-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.586083 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/5cb3c834-6c40-44eb-aef4-7df38d22afc8-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.593805 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-ceph\") pod \"manila-share-share1-0\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.593805 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.594104 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-config-data\") pod \"manila-share-share1-0\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.595181 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-scripts\") pod \"manila-share-share1-0\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.607460 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6f8t\" (UniqueName: \"kubernetes.io/projected/5cb3c834-6c40-44eb-aef4-7df38d22afc8-kube-api-access-g6f8t\") pod \"manila-share-share1-0\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:29 crc kubenswrapper[4703]: I0309 13:43:29.755530 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:30 crc kubenswrapper[4703]: I0309 13:43:30.244728 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-share-share1-0"] Mar 09 13:43:30 crc kubenswrapper[4703]: W0309 13:43:30.247428 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cb3c834_6c40_44eb_aef4_7df38d22afc8.slice/crio-db17930735bbed2ec5e34b311e9c72304f6b063a050c0dbfb7872a62a1d1456d WatchSource:0}: Error finding container db17930735bbed2ec5e34b311e9c72304f6b063a050c0dbfb7872a62a1d1456d: Status 404 returned error can't find the container with id db17930735bbed2ec5e34b311e9c72304f6b063a050c0dbfb7872a62a1d1456d Mar 09 13:43:31 crc kubenswrapper[4703]: I0309 13:43:31.213343 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share1-0" event={"ID":"5cb3c834-6c40-44eb-aef4-7df38d22afc8","Type":"ContainerStarted","Data":"d628886ffcbaf905642ae6cfb92c9ae20f6aba9812afcefc7d329416b047a845"} Mar 09 13:43:31 crc kubenswrapper[4703]: I0309 13:43:31.213690 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share1-0" 
event={"ID":"5cb3c834-6c40-44eb-aef4-7df38d22afc8","Type":"ContainerStarted","Data":"db17930735bbed2ec5e34b311e9c72304f6b063a050c0dbfb7872a62a1d1456d"} Mar 09 13:43:32 crc kubenswrapper[4703]: I0309 13:43:32.222167 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share1-0" event={"ID":"5cb3c834-6c40-44eb-aef4-7df38d22afc8","Type":"ContainerStarted","Data":"0c9dd10691f9634dad59a1505caea7f4e1eccf5113e785e394872fb2eae2f823"} Mar 09 13:43:32 crc kubenswrapper[4703]: I0309 13:43:32.242785 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-share-share1-0" podStartSLOduration=3.242767024 podStartE2EDuration="3.242767024s" podCreationTimestamp="2026-03-09 13:43:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:32.240721407 +0000 UTC m=+1408.208137103" watchObservedRunningTime="2026-03-09 13:43:32.242767024 +0000 UTC m=+1408.210182710" Mar 09 13:43:39 crc kubenswrapper[4703]: I0309 13:43:39.756005 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:51 crc kubenswrapper[4703]: I0309 13:43:51.323334 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:52 crc kubenswrapper[4703]: I0309 13:43:52.426295 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-share-share0-0"] Mar 09 13:43:52 crc kubenswrapper[4703]: I0309 13:43:52.426586 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-share-share0-0" podUID="e403e67c-1041-4639-9ec7-31609de1be4d" containerName="manila-share" containerID="cri-o://65fe9d44f00b0a37b7f2266eb5eb6dec268cf4c7555e1d649ced7070078a8e5a" gracePeriod=30 Mar 09 13:43:52 crc kubenswrapper[4703]: I0309 
13:43:52.426662 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-share-share0-0" podUID="e403e67c-1041-4639-9ec7-31609de1be4d" containerName="probe" containerID="cri-o://2ca09bca95e7bc49adb1ef70fbaccc0c0f394c16c7ed2d36a9e7ba820378bf90" gracePeriod=30 Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.102169 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.259666 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-df48b\" (UniqueName: \"kubernetes.io/projected/e403e67c-1041-4639-9ec7-31609de1be4d-kube-api-access-df48b\") pod \"e403e67c-1041-4639-9ec7-31609de1be4d\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.259736 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-scripts\") pod \"e403e67c-1041-4639-9ec7-31609de1be4d\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.259807 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/e403e67c-1041-4639-9ec7-31609de1be4d-var-lib-manila\") pod \"e403e67c-1041-4639-9ec7-31609de1be4d\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.259904 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e403e67c-1041-4639-9ec7-31609de1be4d-etc-machine-id\") pod \"e403e67c-1041-4639-9ec7-31609de1be4d\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.259990 4703 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-config-data-custom\") pod \"e403e67c-1041-4639-9ec7-31609de1be4d\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.260070 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-ceph\") pod \"e403e67c-1041-4639-9ec7-31609de1be4d\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.260059 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e403e67c-1041-4639-9ec7-31609de1be4d-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "e403e67c-1041-4639-9ec7-31609de1be4d" (UID: "e403e67c-1041-4639-9ec7-31609de1be4d"). InnerVolumeSpecName "var-lib-manila". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.260106 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-config-data\") pod \"e403e67c-1041-4639-9ec7-31609de1be4d\" (UID: \"e403e67c-1041-4639-9ec7-31609de1be4d\") " Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.260715 4703 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/e403e67c-1041-4639-9ec7-31609de1be4d-var-lib-manila\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.260069 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e403e67c-1041-4639-9ec7-31609de1be4d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e403e67c-1041-4639-9ec7-31609de1be4d" (UID: "e403e67c-1041-4639-9ec7-31609de1be4d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.265957 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e403e67c-1041-4639-9ec7-31609de1be4d" (UID: "e403e67c-1041-4639-9ec7-31609de1be4d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.272332 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-scripts" (OuterVolumeSpecName: "scripts") pod "e403e67c-1041-4639-9ec7-31609de1be4d" (UID: "e403e67c-1041-4639-9ec7-31609de1be4d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.275174 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e403e67c-1041-4639-9ec7-31609de1be4d-kube-api-access-df48b" (OuterVolumeSpecName: "kube-api-access-df48b") pod "e403e67c-1041-4639-9ec7-31609de1be4d" (UID: "e403e67c-1041-4639-9ec7-31609de1be4d"). InnerVolumeSpecName "kube-api-access-df48b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.277578 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-ceph" (OuterVolumeSpecName: "ceph") pod "e403e67c-1041-4639-9ec7-31609de1be4d" (UID: "e403e67c-1041-4639-9ec7-31609de1be4d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.362759 4703 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.362860 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-df48b\" (UniqueName: \"kubernetes.io/projected/e403e67c-1041-4639-9ec7-31609de1be4d-kube-api-access-df48b\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.362875 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.362889 4703 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e403e67c-1041-4639-9ec7-31609de1be4d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:53 crc 
kubenswrapper[4703]: I0309 13:43:53.362905 4703 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.368152 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-config-data" (OuterVolumeSpecName: "config-data") pod "e403e67c-1041-4639-9ec7-31609de1be4d" (UID: "e403e67c-1041-4639-9ec7-31609de1be4d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.417347 4703 generic.go:334] "Generic (PLEG): container finished" podID="e403e67c-1041-4639-9ec7-31609de1be4d" containerID="2ca09bca95e7bc49adb1ef70fbaccc0c0f394c16c7ed2d36a9e7ba820378bf90" exitCode=0 Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.417598 4703 generic.go:334] "Generic (PLEG): container finished" podID="e403e67c-1041-4639-9ec7-31609de1be4d" containerID="65fe9d44f00b0a37b7f2266eb5eb6dec268cf4c7555e1d649ced7070078a8e5a" exitCode=1 Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.417393 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share0-0" event={"ID":"e403e67c-1041-4639-9ec7-31609de1be4d","Type":"ContainerDied","Data":"2ca09bca95e7bc49adb1ef70fbaccc0c0f394c16c7ed2d36a9e7ba820378bf90"} Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.417404 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.417633 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share0-0" event={"ID":"e403e67c-1041-4639-9ec7-31609de1be4d","Type":"ContainerDied","Data":"65fe9d44f00b0a37b7f2266eb5eb6dec268cf4c7555e1d649ced7070078a8e5a"} Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.417647 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share0-0" event={"ID":"e403e67c-1041-4639-9ec7-31609de1be4d","Type":"ContainerDied","Data":"27a99a0d10e7428c963ff0bd1567b47d1a4abd5e360957abd546bcafff9da25e"} Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.417660 4703 scope.go:117] "RemoveContainer" containerID="2ca09bca95e7bc49adb1ef70fbaccc0c0f394c16c7ed2d36a9e7ba820378bf90" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.446876 4703 scope.go:117] "RemoveContainer" containerID="65fe9d44f00b0a37b7f2266eb5eb6dec268cf4c7555e1d649ced7070078a8e5a" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.463728 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e403e67c-1041-4639-9ec7-31609de1be4d-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.473647 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-share-share0-0"] Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.477952 4703 scope.go:117] "RemoveContainer" containerID="2ca09bca95e7bc49adb1ef70fbaccc0c0f394c16c7ed2d36a9e7ba820378bf90" Mar 09 13:43:53 crc kubenswrapper[4703]: E0309 13:43:53.478970 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ca09bca95e7bc49adb1ef70fbaccc0c0f394c16c7ed2d36a9e7ba820378bf90\": container with ID starting with 
2ca09bca95e7bc49adb1ef70fbaccc0c0f394c16c7ed2d36a9e7ba820378bf90 not found: ID does not exist" containerID="2ca09bca95e7bc49adb1ef70fbaccc0c0f394c16c7ed2d36a9e7ba820378bf90" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.479039 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ca09bca95e7bc49adb1ef70fbaccc0c0f394c16c7ed2d36a9e7ba820378bf90"} err="failed to get container status \"2ca09bca95e7bc49adb1ef70fbaccc0c0f394c16c7ed2d36a9e7ba820378bf90\": rpc error: code = NotFound desc = could not find container \"2ca09bca95e7bc49adb1ef70fbaccc0c0f394c16c7ed2d36a9e7ba820378bf90\": container with ID starting with 2ca09bca95e7bc49adb1ef70fbaccc0c0f394c16c7ed2d36a9e7ba820378bf90 not found: ID does not exist" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.479076 4703 scope.go:117] "RemoveContainer" containerID="65fe9d44f00b0a37b7f2266eb5eb6dec268cf4c7555e1d649ced7070078a8e5a" Mar 09 13:43:53 crc kubenswrapper[4703]: E0309 13:43:53.479690 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65fe9d44f00b0a37b7f2266eb5eb6dec268cf4c7555e1d649ced7070078a8e5a\": container with ID starting with 65fe9d44f00b0a37b7f2266eb5eb6dec268cf4c7555e1d649ced7070078a8e5a not found: ID does not exist" containerID="65fe9d44f00b0a37b7f2266eb5eb6dec268cf4c7555e1d649ced7070078a8e5a" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.479751 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65fe9d44f00b0a37b7f2266eb5eb6dec268cf4c7555e1d649ced7070078a8e5a"} err="failed to get container status \"65fe9d44f00b0a37b7f2266eb5eb6dec268cf4c7555e1d649ced7070078a8e5a\": rpc error: code = NotFound desc = could not find container \"65fe9d44f00b0a37b7f2266eb5eb6dec268cf4c7555e1d649ced7070078a8e5a\": container with ID starting with 65fe9d44f00b0a37b7f2266eb5eb6dec268cf4c7555e1d649ced7070078a8e5a not found: ID does not 
exist" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.479790 4703 scope.go:117] "RemoveContainer" containerID="2ca09bca95e7bc49adb1ef70fbaccc0c0f394c16c7ed2d36a9e7ba820378bf90" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.480261 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ca09bca95e7bc49adb1ef70fbaccc0c0f394c16c7ed2d36a9e7ba820378bf90"} err="failed to get container status \"2ca09bca95e7bc49adb1ef70fbaccc0c0f394c16c7ed2d36a9e7ba820378bf90\": rpc error: code = NotFound desc = could not find container \"2ca09bca95e7bc49adb1ef70fbaccc0c0f394c16c7ed2d36a9e7ba820378bf90\": container with ID starting with 2ca09bca95e7bc49adb1ef70fbaccc0c0f394c16c7ed2d36a9e7ba820378bf90 not found: ID does not exist" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.480321 4703 scope.go:117] "RemoveContainer" containerID="65fe9d44f00b0a37b7f2266eb5eb6dec268cf4c7555e1d649ced7070078a8e5a" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.480692 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65fe9d44f00b0a37b7f2266eb5eb6dec268cf4c7555e1d649ced7070078a8e5a"} err="failed to get container status \"65fe9d44f00b0a37b7f2266eb5eb6dec268cf4c7555e1d649ced7070078a8e5a\": rpc error: code = NotFound desc = could not find container \"65fe9d44f00b0a37b7f2266eb5eb6dec268cf4c7555e1d649ced7070078a8e5a\": container with ID starting with 65fe9d44f00b0a37b7f2266eb5eb6dec268cf4c7555e1d649ced7070078a8e5a not found: ID does not exist" Mar 09 13:43:53 crc kubenswrapper[4703]: I0309 13:43:53.481335 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-share-share0-0"] Mar 09 13:43:54 crc kubenswrapper[4703]: I0309 13:43:54.714314 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e403e67c-1041-4639-9ec7-31609de1be4d" path="/var/lib/kubelet/pods/e403e67c-1041-4639-9ec7-31609de1be4d/volumes" Mar 09 13:43:54 crc 
kubenswrapper[4703]: I0309 13:43:54.764891 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg"] Mar 09 13:43:54 crc kubenswrapper[4703]: E0309 13:43:54.765283 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e403e67c-1041-4639-9ec7-31609de1be4d" containerName="manila-share" Mar 09 13:43:54 crc kubenswrapper[4703]: I0309 13:43:54.765311 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="e403e67c-1041-4639-9ec7-31609de1be4d" containerName="manila-share" Mar 09 13:43:54 crc kubenswrapper[4703]: E0309 13:43:54.765341 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e403e67c-1041-4639-9ec7-31609de1be4d" containerName="probe" Mar 09 13:43:54 crc kubenswrapper[4703]: I0309 13:43:54.765354 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="e403e67c-1041-4639-9ec7-31609de1be4d" containerName="probe" Mar 09 13:43:54 crc kubenswrapper[4703]: I0309 13:43:54.765565 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="e403e67c-1041-4639-9ec7-31609de1be4d" containerName="manila-share" Mar 09 13:43:54 crc kubenswrapper[4703]: I0309 13:43:54.765635 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="e403e67c-1041-4639-9ec7-31609de1be4d" containerName="probe" Mar 09 13:43:54 crc kubenswrapper[4703]: I0309 13:43:54.766353 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg" Mar 09 13:43:54 crc kubenswrapper[4703]: I0309 13:43:54.775605 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg"] Mar 09 13:43:54 crc kubenswrapper[4703]: I0309 13:43:54.887270 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhskg\" (UniqueName: \"kubernetes.io/projected/75236b41-aeed-410a-8afd-f3a360b158d6-kube-api-access-mhskg\") pod \"manila-service-cleanup-n5b5h655-dm5zg\" (UID: \"75236b41-aeed-410a-8afd-f3a360b158d6\") " pod="manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg" Mar 09 13:43:54 crc kubenswrapper[4703]: I0309 13:43:54.887455 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75236b41-aeed-410a-8afd-f3a360b158d6-config-data\") pod \"manila-service-cleanup-n5b5h655-dm5zg\" (UID: \"75236b41-aeed-410a-8afd-f3a360b158d6\") " pod="manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg" Mar 09 13:43:54 crc kubenswrapper[4703]: I0309 13:43:54.887504 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/75236b41-aeed-410a-8afd-f3a360b158d6-job-config-data\") pod \"manila-service-cleanup-n5b5h655-dm5zg\" (UID: \"75236b41-aeed-410a-8afd-f3a360b158d6\") " pod="manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg" Mar 09 13:43:54 crc kubenswrapper[4703]: I0309 13:43:54.989127 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhskg\" (UniqueName: \"kubernetes.io/projected/75236b41-aeed-410a-8afd-f3a360b158d6-kube-api-access-mhskg\") pod \"manila-service-cleanup-n5b5h655-dm5zg\" (UID: \"75236b41-aeed-410a-8afd-f3a360b158d6\") " 
pod="manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg" Mar 09 13:43:54 crc kubenswrapper[4703]: I0309 13:43:54.989278 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75236b41-aeed-410a-8afd-f3a360b158d6-config-data\") pod \"manila-service-cleanup-n5b5h655-dm5zg\" (UID: \"75236b41-aeed-410a-8afd-f3a360b158d6\") " pod="manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg" Mar 09 13:43:54 crc kubenswrapper[4703]: I0309 13:43:54.989341 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/75236b41-aeed-410a-8afd-f3a360b158d6-job-config-data\") pod \"manila-service-cleanup-n5b5h655-dm5zg\" (UID: \"75236b41-aeed-410a-8afd-f3a360b158d6\") " pod="manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg" Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.002008 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/75236b41-aeed-410a-8afd-f3a360b158d6-job-config-data\") pod \"manila-service-cleanup-n5b5h655-dm5zg\" (UID: \"75236b41-aeed-410a-8afd-f3a360b158d6\") " pod="manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg" Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.012657 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75236b41-aeed-410a-8afd-f3a360b158d6-config-data\") pod \"manila-service-cleanup-n5b5h655-dm5zg\" (UID: \"75236b41-aeed-410a-8afd-f3a360b158d6\") " pod="manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg" Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.014901 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhskg\" (UniqueName: \"kubernetes.io/projected/75236b41-aeed-410a-8afd-f3a360b158d6-kube-api-access-mhskg\") pod \"manila-service-cleanup-n5b5h655-dm5zg\" 
(UID: \"75236b41-aeed-410a-8afd-f3a360b158d6\") " pod="manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg" Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.081691 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg" Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.382133 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-db-sync-574fb"] Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.392197 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-db-sync-574fb"] Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.398298 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg"] Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.425227 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-scheduler-0"] Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.425508 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-scheduler-0" podUID="ae7bdd12-c835-4951-ac39-092d7ef30b56" containerName="manila-scheduler" containerID="cri-o://a5f95260999482ec510f9b65e55aea2aeefea92edf023395b79d7cc84cc2461e" gracePeriod=30 Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.425554 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-scheduler-0" podUID="ae7bdd12-c835-4951-ac39-092d7ef30b56" containerName="probe" containerID="cri-o://2b97044bab0635c83fd0d29d0338fe67f30710236bc9958109f5acc645413141" gracePeriod=30 Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.433920 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-share-share1-0"] Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.434346 4703 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="manila-kuttl-tests/manila-share-share1-0" podUID="5cb3c834-6c40-44eb-aef4-7df38d22afc8" containerName="probe" containerID="cri-o://0c9dd10691f9634dad59a1505caea7f4e1eccf5113e785e394872fb2eae2f823" gracePeriod=30 Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.434320 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-share-share1-0" podUID="5cb3c834-6c40-44eb-aef4-7df38d22afc8" containerName="manila-share" containerID="cri-o://d628886ffcbaf905642ae6cfb92c9ae20f6aba9812afcefc7d329416b047a845" gracePeriod=30 Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.454190 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manilad652-account-delete-d22tz"] Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.455176 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manilad652-account-delete-d22tz" Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.476085 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-api-0"] Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.476402 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-api-0" podUID="a131c573-116e-49d5-925e-64f99d666b43" containerName="manila-api-log" containerID="cri-o://3a2d18d807beedea659a4575bbdd1a0c0696ffe2da749f5e5a97fbaac853ba31" gracePeriod=30 Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.476551 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-api-0" podUID="a131c573-116e-49d5-925e-64f99d666b43" containerName="manila-api" containerID="cri-o://d0e73ee0d5db96d7248a8cb3098094c3a6a1ee812079fd12a8ff3fde3fc0d6de" gracePeriod=30 Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.487707 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["manila-kuttl-tests/manilad652-account-delete-d22tz"] Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.520069 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44905e68-3339-46ca-bb60-82a621d35455-operator-scripts\") pod \"manilad652-account-delete-d22tz\" (UID: \"44905e68-3339-46ca-bb60-82a621d35455\") " pod="manila-kuttl-tests/manilad652-account-delete-d22tz" Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.520126 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw5bt\" (UniqueName: \"kubernetes.io/projected/44905e68-3339-46ca-bb60-82a621d35455-kube-api-access-bw5bt\") pod \"manilad652-account-delete-d22tz\" (UID: \"44905e68-3339-46ca-bb60-82a621d35455\") " pod="manila-kuttl-tests/manilad652-account-delete-d22tz" Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.542500 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg"] Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.621115 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44905e68-3339-46ca-bb60-82a621d35455-operator-scripts\") pod \"manilad652-account-delete-d22tz\" (UID: \"44905e68-3339-46ca-bb60-82a621d35455\") " pod="manila-kuttl-tests/manilad652-account-delete-d22tz" Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.621154 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw5bt\" (UniqueName: \"kubernetes.io/projected/44905e68-3339-46ca-bb60-82a621d35455-kube-api-access-bw5bt\") pod \"manilad652-account-delete-d22tz\" (UID: \"44905e68-3339-46ca-bb60-82a621d35455\") " pod="manila-kuttl-tests/manilad652-account-delete-d22tz" Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.621883 4703 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44905e68-3339-46ca-bb60-82a621d35455-operator-scripts\") pod \"manilad652-account-delete-d22tz\" (UID: \"44905e68-3339-46ca-bb60-82a621d35455\") " pod="manila-kuttl-tests/manilad652-account-delete-d22tz" Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.638399 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw5bt\" (UniqueName: \"kubernetes.io/projected/44905e68-3339-46ca-bb60-82a621d35455-kube-api-access-bw5bt\") pod \"manilad652-account-delete-d22tz\" (UID: \"44905e68-3339-46ca-bb60-82a621d35455\") " pod="manila-kuttl-tests/manilad652-account-delete-d22tz" Mar 09 13:43:55 crc kubenswrapper[4703]: I0309 13:43:55.780158 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manilad652-account-delete-d22tz" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.216794 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manilad652-account-delete-d22tz"] Mar 09 13:43:56 crc kubenswrapper[4703]: W0309 13:43:56.236241 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44905e68_3339_46ca_bb60_82a621d35455.slice/crio-8b4ecae6b5e65f7880b35d411bcb26154a7dea443dbff7ad9823a443744d4691 WatchSource:0}: Error finding container 8b4ecae6b5e65f7880b35d411bcb26154a7dea443dbff7ad9823a443744d4691: Status 404 returned error can't find the container with id 8b4ecae6b5e65f7880b35d411bcb26154a7dea443dbff7ad9823a443744d4691 Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.317654 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.430377 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-scripts\") pod \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.430655 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-config-data\") pod \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.430876 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6f8t\" (UniqueName: \"kubernetes.io/projected/5cb3c834-6c40-44eb-aef4-7df38d22afc8-kube-api-access-g6f8t\") pod \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.430984 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/5cb3c834-6c40-44eb-aef4-7df38d22afc8-var-lib-manila\") pod \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.431071 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-config-data-custom\") pod \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.431235 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-ceph\") pod \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.431453 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5cb3c834-6c40-44eb-aef4-7df38d22afc8-etc-machine-id\") pod \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\" (UID: \"5cb3c834-6c40-44eb-aef4-7df38d22afc8\") " Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.431119 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5cb3c834-6c40-44eb-aef4-7df38d22afc8-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "5cb3c834-6c40-44eb-aef4-7df38d22afc8" (UID: "5cb3c834-6c40-44eb-aef4-7df38d22afc8"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.431541 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5cb3c834-6c40-44eb-aef4-7df38d22afc8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5cb3c834-6c40-44eb-aef4-7df38d22afc8" (UID: "5cb3c834-6c40-44eb-aef4-7df38d22afc8"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.432118 4703 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/5cb3c834-6c40-44eb-aef4-7df38d22afc8-var-lib-manila\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.432377 4703 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5cb3c834-6c40-44eb-aef4-7df38d22afc8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.437025 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-ceph" (OuterVolumeSpecName: "ceph") pod "5cb3c834-6c40-44eb-aef4-7df38d22afc8" (UID: "5cb3c834-6c40-44eb-aef4-7df38d22afc8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.438784 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cb3c834-6c40-44eb-aef4-7df38d22afc8-kube-api-access-g6f8t" (OuterVolumeSpecName: "kube-api-access-g6f8t") pod "5cb3c834-6c40-44eb-aef4-7df38d22afc8" (UID: "5cb3c834-6c40-44eb-aef4-7df38d22afc8"). InnerVolumeSpecName "kube-api-access-g6f8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.438978 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5cb3c834-6c40-44eb-aef4-7df38d22afc8" (UID: "5cb3c834-6c40-44eb-aef4-7df38d22afc8"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.440321 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-scripts" (OuterVolumeSpecName: "scripts") pod "5cb3c834-6c40-44eb-aef4-7df38d22afc8" (UID: "5cb3c834-6c40-44eb-aef4-7df38d22afc8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.462996 4703 generic.go:334] "Generic (PLEG): container finished" podID="5cb3c834-6c40-44eb-aef4-7df38d22afc8" containerID="0c9dd10691f9634dad59a1505caea7f4e1eccf5113e785e394872fb2eae2f823" exitCode=0 Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.463040 4703 generic.go:334] "Generic (PLEG): container finished" podID="5cb3c834-6c40-44eb-aef4-7df38d22afc8" containerID="d628886ffcbaf905642ae6cfb92c9ae20f6aba9812afcefc7d329416b047a845" exitCode=1 Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.463105 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share1-0" event={"ID":"5cb3c834-6c40-44eb-aef4-7df38d22afc8","Type":"ContainerDied","Data":"0c9dd10691f9634dad59a1505caea7f4e1eccf5113e785e394872fb2eae2f823"} Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.463133 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share1-0" event={"ID":"5cb3c834-6c40-44eb-aef4-7df38d22afc8","Type":"ContainerDied","Data":"d628886ffcbaf905642ae6cfb92c9ae20f6aba9812afcefc7d329416b047a845"} Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.463142 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share1-0" event={"ID":"5cb3c834-6c40-44eb-aef4-7df38d22afc8","Type":"ContainerDied","Data":"db17930735bbed2ec5e34b311e9c72304f6b063a050c0dbfb7872a62a1d1456d"} Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.463160 4703 
scope.go:117] "RemoveContainer" containerID="0c9dd10691f9634dad59a1505caea7f4e1eccf5113e785e394872fb2eae2f823" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.464502 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-share-share1-0" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.464603 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg" event={"ID":"75236b41-aeed-410a-8afd-f3a360b158d6","Type":"ContainerStarted","Data":"0ee71e6952cb1b49915baf19088447877cada46f12c09b4f2a6a9270e58b1583"} Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.464808 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg" event={"ID":"75236b41-aeed-410a-8afd-f3a360b158d6","Type":"ContainerStarted","Data":"1d5a468bbef9a494213fbbd85602c17fb628d1a526cf689cce42b1920d39fa8e"} Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.465401 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg" podUID="75236b41-aeed-410a-8afd-f3a360b158d6" containerName="manila-service-cleanup-n5b5h655" containerID="cri-o://0ee71e6952cb1b49915baf19088447877cada46f12c09b4f2a6a9270e58b1583" gracePeriod=30 Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.465697 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manilad652-account-delete-d22tz" event={"ID":"44905e68-3339-46ca-bb60-82a621d35455","Type":"ContainerStarted","Data":"56a38f020f216e904effa92db314b3cb02413f88b655bf72930151beddc3a5d7"} Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.465803 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manilad652-account-delete-d22tz" 
event={"ID":"44905e68-3339-46ca-bb60-82a621d35455","Type":"ContainerStarted","Data":"8b4ecae6b5e65f7880b35d411bcb26154a7dea443dbff7ad9823a443744d4691"} Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.478866 4703 generic.go:334] "Generic (PLEG): container finished" podID="a131c573-116e-49d5-925e-64f99d666b43" containerID="3a2d18d807beedea659a4575bbdd1a0c0696ffe2da749f5e5a97fbaac853ba31" exitCode=143 Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.478959 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-0" event={"ID":"a131c573-116e-49d5-925e-64f99d666b43","Type":"ContainerDied","Data":"3a2d18d807beedea659a4575bbdd1a0c0696ffe2da749f5e5a97fbaac853ba31"} Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.486771 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg" podStartSLOduration=2.486752363 podStartE2EDuration="2.486752363s" podCreationTimestamp="2026-03-09 13:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:56.483192396 +0000 UTC m=+1432.450608102" watchObservedRunningTime="2026-03-09 13:43:56.486752363 +0000 UTC m=+1432.454168049" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.493692 4703 scope.go:117] "RemoveContainer" containerID="d628886ffcbaf905642ae6cfb92c9ae20f6aba9812afcefc7d329416b047a845" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.494279 4703 generic.go:334] "Generic (PLEG): container finished" podID="ae7bdd12-c835-4951-ac39-092d7ef30b56" containerID="2b97044bab0635c83fd0d29d0338fe67f30710236bc9958109f5acc645413141" exitCode=0 Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.494242 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-0" 
event={"ID":"ae7bdd12-c835-4951-ac39-092d7ef30b56","Type":"ContainerDied","Data":"2b97044bab0635c83fd0d29d0338fe67f30710236bc9958109f5acc645413141"} Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.513823 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manilad652-account-delete-d22tz" podStartSLOduration=1.513802182 podStartE2EDuration="1.513802182s" podCreationTimestamp="2026-03-09 13:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:43:56.503213513 +0000 UTC m=+1432.470629199" watchObservedRunningTime="2026-03-09 13:43:56.513802182 +0000 UTC m=+1432.481217878" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.514285 4703 scope.go:117] "RemoveContainer" containerID="0c9dd10691f9634dad59a1505caea7f4e1eccf5113e785e394872fb2eae2f823" Mar 09 13:43:56 crc kubenswrapper[4703]: E0309 13:43:56.517372 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c9dd10691f9634dad59a1505caea7f4e1eccf5113e785e394872fb2eae2f823\": container with ID starting with 0c9dd10691f9634dad59a1505caea7f4e1eccf5113e785e394872fb2eae2f823 not found: ID does not exist" containerID="0c9dd10691f9634dad59a1505caea7f4e1eccf5113e785e394872fb2eae2f823" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.517430 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c9dd10691f9634dad59a1505caea7f4e1eccf5113e785e394872fb2eae2f823"} err="failed to get container status \"0c9dd10691f9634dad59a1505caea7f4e1eccf5113e785e394872fb2eae2f823\": rpc error: code = NotFound desc = could not find container \"0c9dd10691f9634dad59a1505caea7f4e1eccf5113e785e394872fb2eae2f823\": container with ID starting with 0c9dd10691f9634dad59a1505caea7f4e1eccf5113e785e394872fb2eae2f823 not found: ID does not exist" Mar 09 13:43:56 crc 
kubenswrapper[4703]: I0309 13:43:56.517458 4703 scope.go:117] "RemoveContainer" containerID="d628886ffcbaf905642ae6cfb92c9ae20f6aba9812afcefc7d329416b047a845" Mar 09 13:43:56 crc kubenswrapper[4703]: E0309 13:43:56.517979 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d628886ffcbaf905642ae6cfb92c9ae20f6aba9812afcefc7d329416b047a845\": container with ID starting with d628886ffcbaf905642ae6cfb92c9ae20f6aba9812afcefc7d329416b047a845 not found: ID does not exist" containerID="d628886ffcbaf905642ae6cfb92c9ae20f6aba9812afcefc7d329416b047a845" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.518033 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d628886ffcbaf905642ae6cfb92c9ae20f6aba9812afcefc7d329416b047a845"} err="failed to get container status \"d628886ffcbaf905642ae6cfb92c9ae20f6aba9812afcefc7d329416b047a845\": rpc error: code = NotFound desc = could not find container \"d628886ffcbaf905642ae6cfb92c9ae20f6aba9812afcefc7d329416b047a845\": container with ID starting with d628886ffcbaf905642ae6cfb92c9ae20f6aba9812afcefc7d329416b047a845 not found: ID does not exist" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.518062 4703 scope.go:117] "RemoveContainer" containerID="0c9dd10691f9634dad59a1505caea7f4e1eccf5113e785e394872fb2eae2f823" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.520553 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c9dd10691f9634dad59a1505caea7f4e1eccf5113e785e394872fb2eae2f823"} err="failed to get container status \"0c9dd10691f9634dad59a1505caea7f4e1eccf5113e785e394872fb2eae2f823\": rpc error: code = NotFound desc = could not find container \"0c9dd10691f9634dad59a1505caea7f4e1eccf5113e785e394872fb2eae2f823\": container with ID starting with 0c9dd10691f9634dad59a1505caea7f4e1eccf5113e785e394872fb2eae2f823 not found: ID does not exist" Mar 09 
13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.520613 4703 scope.go:117] "RemoveContainer" containerID="d628886ffcbaf905642ae6cfb92c9ae20f6aba9812afcefc7d329416b047a845" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.521071 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d628886ffcbaf905642ae6cfb92c9ae20f6aba9812afcefc7d329416b047a845"} err="failed to get container status \"d628886ffcbaf905642ae6cfb92c9ae20f6aba9812afcefc7d329416b047a845\": rpc error: code = NotFound desc = could not find container \"d628886ffcbaf905642ae6cfb92c9ae20f6aba9812afcefc7d329416b047a845\": container with ID starting with d628886ffcbaf905642ae6cfb92c9ae20f6aba9812afcefc7d329416b047a845 not found: ID does not exist" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.533511 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.533551 4703 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.533567 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6f8t\" (UniqueName: \"kubernetes.io/projected/5cb3c834-6c40-44eb-aef4-7df38d22afc8-kube-api-access-g6f8t\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.533580 4703 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.536486 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-config-data" (OuterVolumeSpecName: "config-data") pod "5cb3c834-6c40-44eb-aef4-7df38d22afc8" (UID: "5cb3c834-6c40-44eb-aef4-7df38d22afc8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.635044 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb3c834-6c40-44eb-aef4-7df38d22afc8-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.715944 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="808476d5-bf47-48a4-8735-4e89f4cac1f8" path="/var/lib/kubelet/pods/808476d5-bf47-48a4-8735-4e89f4cac1f8/volumes" Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.791193 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-share-share1-0"] Mar 09 13:43:56 crc kubenswrapper[4703]: I0309 13:43:56.795364 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-share-share1-0"] Mar 09 13:43:57 crc kubenswrapper[4703]: I0309 13:43:57.509625 4703 generic.go:334] "Generic (PLEG): container finished" podID="44905e68-3339-46ca-bb60-82a621d35455" containerID="56a38f020f216e904effa92db314b3cb02413f88b655bf72930151beddc3a5d7" exitCode=0 Mar 09 13:43:57 crc kubenswrapper[4703]: I0309 13:43:57.509744 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manilad652-account-delete-d22tz" event={"ID":"44905e68-3339-46ca-bb60-82a621d35455","Type":"ContainerDied","Data":"56a38f020f216e904effa92db314b3cb02413f88b655bf72930151beddc3a5d7"} Mar 09 13:43:58 crc kubenswrapper[4703]: I0309 13:43:58.626543 4703 prober.go:107] "Probe failed" probeType="Readiness" pod="manila-kuttl-tests/manila-api-0" podUID="a131c573-116e-49d5-925e-64f99d666b43" containerName="manila-api" probeResult="failure" output="Get 
\"http://10.217.0.106:8786/healthcheck\": read tcp 10.217.0.2:42978->10.217.0.106:8786: read: connection reset by peer" Mar 09 13:43:58 crc kubenswrapper[4703]: I0309 13:43:58.716436 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cb3c834-6c40-44eb-aef4-7df38d22afc8" path="/var/lib/kubelet/pods/5cb3c834-6c40-44eb-aef4-7df38d22afc8/volumes" Mar 09 13:43:58 crc kubenswrapper[4703]: I0309 13:43:58.877120 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manilad652-account-delete-d22tz" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.071743 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44905e68-3339-46ca-bb60-82a621d35455-operator-scripts\") pod \"44905e68-3339-46ca-bb60-82a621d35455\" (UID: \"44905e68-3339-46ca-bb60-82a621d35455\") " Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.072234 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw5bt\" (UniqueName: \"kubernetes.io/projected/44905e68-3339-46ca-bb60-82a621d35455-kube-api-access-bw5bt\") pod \"44905e68-3339-46ca-bb60-82a621d35455\" (UID: \"44905e68-3339-46ca-bb60-82a621d35455\") " Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.072635 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44905e68-3339-46ca-bb60-82a621d35455-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44905e68-3339-46ca-bb60-82a621d35455" (UID: "44905e68-3339-46ca-bb60-82a621d35455"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.077227 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44905e68-3339-46ca-bb60-82a621d35455-kube-api-access-bw5bt" (OuterVolumeSpecName: "kube-api-access-bw5bt") pod "44905e68-3339-46ca-bb60-82a621d35455" (UID: "44905e68-3339-46ca-bb60-82a621d35455"). InnerVolumeSpecName "kube-api-access-bw5bt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.079516 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.084566 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.173888 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a131c573-116e-49d5-925e-64f99d666b43-logs\") pod \"a131c573-116e-49d5-925e-64f99d666b43\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.174361 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a131c573-116e-49d5-925e-64f99d666b43-etc-machine-id\") pod \"a131c573-116e-49d5-925e-64f99d666b43\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.174487 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a131c573-116e-49d5-925e-64f99d666b43-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a131c573-116e-49d5-925e-64f99d666b43" (UID: "a131c573-116e-49d5-925e-64f99d666b43"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.174535 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a131c573-116e-49d5-925e-64f99d666b43-logs" (OuterVolumeSpecName: "logs") pod "a131c573-116e-49d5-925e-64f99d666b43" (UID: "a131c573-116e-49d5-925e-64f99d666b43"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.174640 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lctp\" (UniqueName: \"kubernetes.io/projected/ae7bdd12-c835-4951-ac39-092d7ef30b56-kube-api-access-4lctp\") pod \"ae7bdd12-c835-4951-ac39-092d7ef30b56\" (UID: \"ae7bdd12-c835-4951-ac39-092d7ef30b56\") " Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.175126 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7bdd12-c835-4951-ac39-092d7ef30b56-config-data\") pod \"ae7bdd12-c835-4951-ac39-092d7ef30b56\" (UID: \"ae7bdd12-c835-4951-ac39-092d7ef30b56\") " Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.175510 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt56f\" (UniqueName: \"kubernetes.io/projected/a131c573-116e-49d5-925e-64f99d666b43-kube-api-access-vt56f\") pod \"a131c573-116e-49d5-925e-64f99d666b43\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.175670 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae7bdd12-c835-4951-ac39-092d7ef30b56-config-data-custom\") pod \"ae7bdd12-c835-4951-ac39-092d7ef30b56\" (UID: \"ae7bdd12-c835-4951-ac39-092d7ef30b56\") " Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.175898 4703 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a131c573-116e-49d5-925e-64f99d666b43-config-data\") pod \"a131c573-116e-49d5-925e-64f99d666b43\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.176031 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a131c573-116e-49d5-925e-64f99d666b43-scripts\") pod \"a131c573-116e-49d5-925e-64f99d666b43\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.176163 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae7bdd12-c835-4951-ac39-092d7ef30b56-scripts\") pod \"ae7bdd12-c835-4951-ac39-092d7ef30b56\" (UID: \"ae7bdd12-c835-4951-ac39-092d7ef30b56\") " Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.176355 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a131c573-116e-49d5-925e-64f99d666b43-config-data-custom\") pod \"a131c573-116e-49d5-925e-64f99d666b43\" (UID: \"a131c573-116e-49d5-925e-64f99d666b43\") " Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.176442 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ae7bdd12-c835-4951-ac39-092d7ef30b56-etc-machine-id\") pod \"ae7bdd12-c835-4951-ac39-092d7ef30b56\" (UID: \"ae7bdd12-c835-4951-ac39-092d7ef30b56\") " Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.176487 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae7bdd12-c835-4951-ac39-092d7ef30b56-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ae7bdd12-c835-4951-ac39-092d7ef30b56" (UID: 
"ae7bdd12-c835-4951-ac39-092d7ef30b56"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.176940 4703 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ae7bdd12-c835-4951-ac39-092d7ef30b56-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.177046 4703 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a131c573-116e-49d5-925e-64f99d666b43-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.177120 4703 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a131c573-116e-49d5-925e-64f99d666b43-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.177360 4703 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44905e68-3339-46ca-bb60-82a621d35455-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.177473 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw5bt\" (UniqueName: \"kubernetes.io/projected/44905e68-3339-46ca-bb60-82a621d35455-kube-api-access-bw5bt\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.178833 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae7bdd12-c835-4951-ac39-092d7ef30b56-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ae7bdd12-c835-4951-ac39-092d7ef30b56" (UID: "ae7bdd12-c835-4951-ac39-092d7ef30b56"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.179396 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a131c573-116e-49d5-925e-64f99d666b43-scripts" (OuterVolumeSpecName: "scripts") pod "a131c573-116e-49d5-925e-64f99d666b43" (UID: "a131c573-116e-49d5-925e-64f99d666b43"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.179427 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a131c573-116e-49d5-925e-64f99d666b43-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a131c573-116e-49d5-925e-64f99d666b43" (UID: "a131c573-116e-49d5-925e-64f99d666b43"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.179520 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae7bdd12-c835-4951-ac39-092d7ef30b56-kube-api-access-4lctp" (OuterVolumeSpecName: "kube-api-access-4lctp") pod "ae7bdd12-c835-4951-ac39-092d7ef30b56" (UID: "ae7bdd12-c835-4951-ac39-092d7ef30b56"). InnerVolumeSpecName "kube-api-access-4lctp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.179640 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae7bdd12-c835-4951-ac39-092d7ef30b56-scripts" (OuterVolumeSpecName: "scripts") pod "ae7bdd12-c835-4951-ac39-092d7ef30b56" (UID: "ae7bdd12-c835-4951-ac39-092d7ef30b56"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.180358 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a131c573-116e-49d5-925e-64f99d666b43-kube-api-access-vt56f" (OuterVolumeSpecName: "kube-api-access-vt56f") pod "a131c573-116e-49d5-925e-64f99d666b43" (UID: "a131c573-116e-49d5-925e-64f99d666b43"). InnerVolumeSpecName "kube-api-access-vt56f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.206372 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a131c573-116e-49d5-925e-64f99d666b43-config-data" (OuterVolumeSpecName: "config-data") pod "a131c573-116e-49d5-925e-64f99d666b43" (UID: "a131c573-116e-49d5-925e-64f99d666b43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.228208 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae7bdd12-c835-4951-ac39-092d7ef30b56-config-data" (OuterVolumeSpecName: "config-data") pod "ae7bdd12-c835-4951-ac39-092d7ef30b56" (UID: "ae7bdd12-c835-4951-ac39-092d7ef30b56"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.278326 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt56f\" (UniqueName: \"kubernetes.io/projected/a131c573-116e-49d5-925e-64f99d666b43-kube-api-access-vt56f\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.278511 4703 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae7bdd12-c835-4951-ac39-092d7ef30b56-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.278581 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a131c573-116e-49d5-925e-64f99d666b43-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.278635 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a131c573-116e-49d5-925e-64f99d666b43-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.278682 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae7bdd12-c835-4951-ac39-092d7ef30b56-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.278736 4703 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a131c573-116e-49d5-925e-64f99d666b43-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.278783 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lctp\" (UniqueName: \"kubernetes.io/projected/ae7bdd12-c835-4951-ac39-092d7ef30b56-kube-api-access-4lctp\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.278829 4703 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7bdd12-c835-4951-ac39-092d7ef30b56-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.536268 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manilad652-account-delete-d22tz" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.536289 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manilad652-account-delete-d22tz" event={"ID":"44905e68-3339-46ca-bb60-82a621d35455","Type":"ContainerDied","Data":"8b4ecae6b5e65f7880b35d411bcb26154a7dea443dbff7ad9823a443744d4691"} Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.536338 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b4ecae6b5e65f7880b35d411bcb26154a7dea443dbff7ad9823a443744d4691" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.539408 4703 generic.go:334] "Generic (PLEG): container finished" podID="a131c573-116e-49d5-925e-64f99d666b43" containerID="d0e73ee0d5db96d7248a8cb3098094c3a6a1ee812079fd12a8ff3fde3fc0d6de" exitCode=0 Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.539552 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-api-0" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.539721 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-0" event={"ID":"a131c573-116e-49d5-925e-64f99d666b43","Type":"ContainerDied","Data":"d0e73ee0d5db96d7248a8cb3098094c3a6a1ee812079fd12a8ff3fde3fc0d6de"} Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.539790 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-0" event={"ID":"a131c573-116e-49d5-925e-64f99d666b43","Type":"ContainerDied","Data":"8c5ba1feab9f707b93b4039d4cce57d60156dd5ebd0e2fe2476ec347de2b0be4"} Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.539837 4703 scope.go:117] "RemoveContainer" containerID="d0e73ee0d5db96d7248a8cb3098094c3a6a1ee812079fd12a8ff3fde3fc0d6de" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.547218 4703 generic.go:334] "Generic (PLEG): container finished" podID="ae7bdd12-c835-4951-ac39-092d7ef30b56" containerID="a5f95260999482ec510f9b65e55aea2aeefea92edf023395b79d7cc84cc2461e" exitCode=0 Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.547278 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-0" event={"ID":"ae7bdd12-c835-4951-ac39-092d7ef30b56","Type":"ContainerDied","Data":"a5f95260999482ec510f9b65e55aea2aeefea92edf023395b79d7cc84cc2461e"} Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.547346 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-0" event={"ID":"ae7bdd12-c835-4951-ac39-092d7ef30b56","Type":"ContainerDied","Data":"377eace6ad10d6363c43a1d21e8a0f2da659617fc74653d9e7f5d81e374f784c"} Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.547440 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.592084 4703 scope.go:117] "RemoveContainer" containerID="3a2d18d807beedea659a4575bbdd1a0c0696ffe2da749f5e5a97fbaac853ba31" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.607070 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-scheduler-0"] Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.611740 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-scheduler-0"] Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.617465 4703 scope.go:117] "RemoveContainer" containerID="d0e73ee0d5db96d7248a8cb3098094c3a6a1ee812079fd12a8ff3fde3fc0d6de" Mar 09 13:43:59 crc kubenswrapper[4703]: E0309 13:43:59.618014 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0e73ee0d5db96d7248a8cb3098094c3a6a1ee812079fd12a8ff3fde3fc0d6de\": container with ID starting with d0e73ee0d5db96d7248a8cb3098094c3a6a1ee812079fd12a8ff3fde3fc0d6de not found: ID does not exist" containerID="d0e73ee0d5db96d7248a8cb3098094c3a6a1ee812079fd12a8ff3fde3fc0d6de" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.618054 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0e73ee0d5db96d7248a8cb3098094c3a6a1ee812079fd12a8ff3fde3fc0d6de"} err="failed to get container status \"d0e73ee0d5db96d7248a8cb3098094c3a6a1ee812079fd12a8ff3fde3fc0d6de\": rpc error: code = NotFound desc = could not find container \"d0e73ee0d5db96d7248a8cb3098094c3a6a1ee812079fd12a8ff3fde3fc0d6de\": container with ID starting with d0e73ee0d5db96d7248a8cb3098094c3a6a1ee812079fd12a8ff3fde3fc0d6de not found: ID does not exist" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.618078 4703 scope.go:117] "RemoveContainer" containerID="3a2d18d807beedea659a4575bbdd1a0c0696ffe2da749f5e5a97fbaac853ba31" Mar 09 
13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.618409 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-api-0"] Mar 09 13:43:59 crc kubenswrapper[4703]: E0309 13:43:59.618459 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a2d18d807beedea659a4575bbdd1a0c0696ffe2da749f5e5a97fbaac853ba31\": container with ID starting with 3a2d18d807beedea659a4575bbdd1a0c0696ffe2da749f5e5a97fbaac853ba31 not found: ID does not exist" containerID="3a2d18d807beedea659a4575bbdd1a0c0696ffe2da749f5e5a97fbaac853ba31" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.619294 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a2d18d807beedea659a4575bbdd1a0c0696ffe2da749f5e5a97fbaac853ba31"} err="failed to get container status \"3a2d18d807beedea659a4575bbdd1a0c0696ffe2da749f5e5a97fbaac853ba31\": rpc error: code = NotFound desc = could not find container \"3a2d18d807beedea659a4575bbdd1a0c0696ffe2da749f5e5a97fbaac853ba31\": container with ID starting with 3a2d18d807beedea659a4575bbdd1a0c0696ffe2da749f5e5a97fbaac853ba31 not found: ID does not exist" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.619343 4703 scope.go:117] "RemoveContainer" containerID="2b97044bab0635c83fd0d29d0338fe67f30710236bc9958109f5acc645413141" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.624570 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-api-0"] Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.640754 4703 scope.go:117] "RemoveContainer" containerID="a5f95260999482ec510f9b65e55aea2aeefea92edf023395b79d7cc84cc2461e" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.669339 4703 scope.go:117] "RemoveContainer" containerID="2b97044bab0635c83fd0d29d0338fe67f30710236bc9958109f5acc645413141" Mar 09 13:43:59 crc kubenswrapper[4703]: E0309 13:43:59.669839 4703 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b97044bab0635c83fd0d29d0338fe67f30710236bc9958109f5acc645413141\": container with ID starting with 2b97044bab0635c83fd0d29d0338fe67f30710236bc9958109f5acc645413141 not found: ID does not exist" containerID="2b97044bab0635c83fd0d29d0338fe67f30710236bc9958109f5acc645413141" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.669905 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b97044bab0635c83fd0d29d0338fe67f30710236bc9958109f5acc645413141"} err="failed to get container status \"2b97044bab0635c83fd0d29d0338fe67f30710236bc9958109f5acc645413141\": rpc error: code = NotFound desc = could not find container \"2b97044bab0635c83fd0d29d0338fe67f30710236bc9958109f5acc645413141\": container with ID starting with 2b97044bab0635c83fd0d29d0338fe67f30710236bc9958109f5acc645413141 not found: ID does not exist" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.669954 4703 scope.go:117] "RemoveContainer" containerID="a5f95260999482ec510f9b65e55aea2aeefea92edf023395b79d7cc84cc2461e" Mar 09 13:43:59 crc kubenswrapper[4703]: E0309 13:43:59.670272 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5f95260999482ec510f9b65e55aea2aeefea92edf023395b79d7cc84cc2461e\": container with ID starting with a5f95260999482ec510f9b65e55aea2aeefea92edf023395b79d7cc84cc2461e not found: ID does not exist" containerID="a5f95260999482ec510f9b65e55aea2aeefea92edf023395b79d7cc84cc2461e" Mar 09 13:43:59 crc kubenswrapper[4703]: I0309 13:43:59.670298 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5f95260999482ec510f9b65e55aea2aeefea92edf023395b79d7cc84cc2461e"} err="failed to get container status \"a5f95260999482ec510f9b65e55aea2aeefea92edf023395b79d7cc84cc2461e\": rpc error: code = NotFound desc = could not find 
container \"a5f95260999482ec510f9b65e55aea2aeefea92edf023395b79d7cc84cc2461e\": container with ID starting with a5f95260999482ec510f9b65e55aea2aeefea92edf023395b79d7cc84cc2461e not found: ID does not exist" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.141563 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551064-q2lc8"] Mar 09 13:44:00 crc kubenswrapper[4703]: E0309 13:44:00.142380 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44905e68-3339-46ca-bb60-82a621d35455" containerName="mariadb-account-delete" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.142402 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="44905e68-3339-46ca-bb60-82a621d35455" containerName="mariadb-account-delete" Mar 09 13:44:00 crc kubenswrapper[4703]: E0309 13:44:00.142418 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cb3c834-6c40-44eb-aef4-7df38d22afc8" containerName="manila-share" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.142428 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cb3c834-6c40-44eb-aef4-7df38d22afc8" containerName="manila-share" Mar 09 13:44:00 crc kubenswrapper[4703]: E0309 13:44:00.142454 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a131c573-116e-49d5-925e-64f99d666b43" containerName="manila-api" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.142461 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="a131c573-116e-49d5-925e-64f99d666b43" containerName="manila-api" Mar 09 13:44:00 crc kubenswrapper[4703]: E0309 13:44:00.142481 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cb3c834-6c40-44eb-aef4-7df38d22afc8" containerName="probe" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.142489 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cb3c834-6c40-44eb-aef4-7df38d22afc8" containerName="probe" Mar 09 13:44:00 crc kubenswrapper[4703]: E0309 
13:44:00.142500 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a131c573-116e-49d5-925e-64f99d666b43" containerName="manila-api-log" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.142506 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="a131c573-116e-49d5-925e-64f99d666b43" containerName="manila-api-log" Mar 09 13:44:00 crc kubenswrapper[4703]: E0309 13:44:00.142515 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7bdd12-c835-4951-ac39-092d7ef30b56" containerName="manila-scheduler" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.142521 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae7bdd12-c835-4951-ac39-092d7ef30b56" containerName="manila-scheduler" Mar 09 13:44:00 crc kubenswrapper[4703]: E0309 13:44:00.142532 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7bdd12-c835-4951-ac39-092d7ef30b56" containerName="probe" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.142537 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae7bdd12-c835-4951-ac39-092d7ef30b56" containerName="probe" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.142814 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="a131c573-116e-49d5-925e-64f99d666b43" containerName="manila-api" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.142878 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cb3c834-6c40-44eb-aef4-7df38d22afc8" containerName="probe" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.142899 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="a131c573-116e-49d5-925e-64f99d666b43" containerName="manila-api-log" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.142907 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae7bdd12-c835-4951-ac39-092d7ef30b56" containerName="manila-scheduler" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.142925 4703 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ae7bdd12-c835-4951-ac39-092d7ef30b56" containerName="probe" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.142937 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cb3c834-6c40-44eb-aef4-7df38d22afc8" containerName="manila-share" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.142951 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="44905e68-3339-46ca-bb60-82a621d35455" containerName="mariadb-account-delete" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.143578 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551064-q2lc8" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.149514 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.157700 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.158003 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rkrjn" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.158370 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551064-q2lc8"] Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.195234 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdszz\" (UniqueName: \"kubernetes.io/projected/97219a68-6ef6-4f8e-947d-da434f9931c4-kube-api-access-rdszz\") pod \"auto-csr-approver-29551064-q2lc8\" (UID: \"97219a68-6ef6-4f8e-947d-da434f9931c4\") " pod="openshift-infra/auto-csr-approver-29551064-q2lc8" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.296356 4703 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-rdszz\" (UniqueName: \"kubernetes.io/projected/97219a68-6ef6-4f8e-947d-da434f9931c4-kube-api-access-rdszz\") pod \"auto-csr-approver-29551064-q2lc8\" (UID: \"97219a68-6ef6-4f8e-947d-da434f9931c4\") " pod="openshift-infra/auto-csr-approver-29551064-q2lc8" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.315606 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdszz\" (UniqueName: \"kubernetes.io/projected/97219a68-6ef6-4f8e-947d-da434f9931c4-kube-api-access-rdszz\") pod \"auto-csr-approver-29551064-q2lc8\" (UID: \"97219a68-6ef6-4f8e-947d-da434f9931c4\") " pod="openshift-infra/auto-csr-approver-29551064-q2lc8" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.471792 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-db-create-f4lks"] Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.482163 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551064-q2lc8" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.484224 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-db-create-f4lks"] Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.512608 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-d652-account-create-update-f4rqg"] Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.527042 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manilad652-account-delete-d22tz"] Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.538407 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-d652-account-create-update-f4rqg"] Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.544867 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manilad652-account-delete-d22tz"] Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 
13:44:00.677094 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-db-create-w7clx"] Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.678334 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-db-create-w7clx" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.683620 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-b5f3-account-create-update-w5hc9"] Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.684561 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-b5f3-account-create-update-w5hc9" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.686197 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-db-secret" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.690360 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-db-create-w7clx"] Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.698277 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-b5f3-account-create-update-w5hc9"] Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.701859 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f75a999f-5349-47b5-9d88-f746113cf48b-operator-scripts\") pod \"manila-b5f3-account-create-update-w5hc9\" (UID: \"f75a999f-5349-47b5-9d88-f746113cf48b\") " pod="manila-kuttl-tests/manila-b5f3-account-create-update-w5hc9" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.701946 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmqrj\" (UniqueName: \"kubernetes.io/projected/f75a999f-5349-47b5-9d88-f746113cf48b-kube-api-access-vmqrj\") pod \"manila-b5f3-account-create-update-w5hc9\" (UID: 
\"f75a999f-5349-47b5-9d88-f746113cf48b\") " pod="manila-kuttl-tests/manila-b5f3-account-create-update-w5hc9" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.702041 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9286dca-288a-4d5b-9bca-bed8960bf88c-operator-scripts\") pod \"manila-db-create-w7clx\" (UID: \"e9286dca-288a-4d5b-9bca-bed8960bf88c\") " pod="manila-kuttl-tests/manila-db-create-w7clx" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.702096 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65cj2\" (UniqueName: \"kubernetes.io/projected/e9286dca-288a-4d5b-9bca-bed8960bf88c-kube-api-access-65cj2\") pod \"manila-db-create-w7clx\" (UID: \"e9286dca-288a-4d5b-9bca-bed8960bf88c\") " pod="manila-kuttl-tests/manila-db-create-w7clx" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.716010 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44905e68-3339-46ca-bb60-82a621d35455" path="/var/lib/kubelet/pods/44905e68-3339-46ca-bb60-82a621d35455/volumes" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.716824 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6412a67b-65cf-4e02-801b-10cc7473496a" path="/var/lib/kubelet/pods/6412a67b-65cf-4e02-801b-10cc7473496a/volumes" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.717350 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85cb844c-e76f-4c27-a213-30185eeffd4f" path="/var/lib/kubelet/pods/85cb844c-e76f-4c27-a213-30185eeffd4f/volumes" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.718240 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a131c573-116e-49d5-925e-64f99d666b43" path="/var/lib/kubelet/pods/a131c573-116e-49d5-925e-64f99d666b43/volumes" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.718801 
4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae7bdd12-c835-4951-ac39-092d7ef30b56" path="/var/lib/kubelet/pods/ae7bdd12-c835-4951-ac39-092d7ef30b56/volumes" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.804698 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmqrj\" (UniqueName: \"kubernetes.io/projected/f75a999f-5349-47b5-9d88-f746113cf48b-kube-api-access-vmqrj\") pod \"manila-b5f3-account-create-update-w5hc9\" (UID: \"f75a999f-5349-47b5-9d88-f746113cf48b\") " pod="manila-kuttl-tests/manila-b5f3-account-create-update-w5hc9" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.804893 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9286dca-288a-4d5b-9bca-bed8960bf88c-operator-scripts\") pod \"manila-db-create-w7clx\" (UID: \"e9286dca-288a-4d5b-9bca-bed8960bf88c\") " pod="manila-kuttl-tests/manila-db-create-w7clx" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.804986 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65cj2\" (UniqueName: \"kubernetes.io/projected/e9286dca-288a-4d5b-9bca-bed8960bf88c-kube-api-access-65cj2\") pod \"manila-db-create-w7clx\" (UID: \"e9286dca-288a-4d5b-9bca-bed8960bf88c\") " pod="manila-kuttl-tests/manila-db-create-w7clx" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.805085 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f75a999f-5349-47b5-9d88-f746113cf48b-operator-scripts\") pod \"manila-b5f3-account-create-update-w5hc9\" (UID: \"f75a999f-5349-47b5-9d88-f746113cf48b\") " pod="manila-kuttl-tests/manila-b5f3-account-create-update-w5hc9" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.805755 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/e9286dca-288a-4d5b-9bca-bed8960bf88c-operator-scripts\") pod \"manila-db-create-w7clx\" (UID: \"e9286dca-288a-4d5b-9bca-bed8960bf88c\") " pod="manila-kuttl-tests/manila-db-create-w7clx" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.806198 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f75a999f-5349-47b5-9d88-f746113cf48b-operator-scripts\") pod \"manila-b5f3-account-create-update-w5hc9\" (UID: \"f75a999f-5349-47b5-9d88-f746113cf48b\") " pod="manila-kuttl-tests/manila-b5f3-account-create-update-w5hc9" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.821604 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65cj2\" (UniqueName: \"kubernetes.io/projected/e9286dca-288a-4d5b-9bca-bed8960bf88c-kube-api-access-65cj2\") pod \"manila-db-create-w7clx\" (UID: \"e9286dca-288a-4d5b-9bca-bed8960bf88c\") " pod="manila-kuttl-tests/manila-db-create-w7clx" Mar 09 13:44:00 crc kubenswrapper[4703]: I0309 13:44:00.823876 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmqrj\" (UniqueName: \"kubernetes.io/projected/f75a999f-5349-47b5-9d88-f746113cf48b-kube-api-access-vmqrj\") pod \"manila-b5f3-account-create-update-w5hc9\" (UID: \"f75a999f-5349-47b5-9d88-f746113cf48b\") " pod="manila-kuttl-tests/manila-b5f3-account-create-update-w5hc9" Mar 09 13:44:01 crc kubenswrapper[4703]: I0309 13:44:01.003947 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-db-create-w7clx" Mar 09 13:44:01 crc kubenswrapper[4703]: I0309 13:44:01.010403 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-b5f3-account-create-update-w5hc9" Mar 09 13:44:01 crc kubenswrapper[4703]: I0309 13:44:01.013138 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551064-q2lc8"] Mar 09 13:44:01 crc kubenswrapper[4703]: I0309 13:44:01.263368 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-db-create-w7clx"] Mar 09 13:44:01 crc kubenswrapper[4703]: I0309 13:44:01.330545 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-b5f3-account-create-update-w5hc9"] Mar 09 13:44:01 crc kubenswrapper[4703]: W0309 13:44:01.340338 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf75a999f_5349_47b5_9d88_f746113cf48b.slice/crio-357f97b74c9209f039e3c2284789697e2130418c19d5778414915fb4ddf7bf94 WatchSource:0}: Error finding container 357f97b74c9209f039e3c2284789697e2130418c19d5778414915fb4ddf7bf94: Status 404 returned error can't find the container with id 357f97b74c9209f039e3c2284789697e2130418c19d5778414915fb4ddf7bf94 Mar 09 13:44:01 crc kubenswrapper[4703]: I0309 13:44:01.573076 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551064-q2lc8" event={"ID":"97219a68-6ef6-4f8e-947d-da434f9931c4","Type":"ContainerStarted","Data":"c69a9e6ebb0412e30375bb345fabfbb6710052481e0cbbfbbf6def9b2efdffe7"} Mar 09 13:44:01 crc kubenswrapper[4703]: I0309 13:44:01.575524 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-b5f3-account-create-update-w5hc9" event={"ID":"f75a999f-5349-47b5-9d88-f746113cf48b","Type":"ContainerStarted","Data":"42dc11fd013e2e4632457a5aca0780e3a954bfa4626018fcdf35f7c3fc4eb0d9"} Mar 09 13:44:01 crc kubenswrapper[4703]: I0309 13:44:01.575579 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-b5f3-account-create-update-w5hc9" 
event={"ID":"f75a999f-5349-47b5-9d88-f746113cf48b","Type":"ContainerStarted","Data":"357f97b74c9209f039e3c2284789697e2130418c19d5778414915fb4ddf7bf94"} Mar 09 13:44:01 crc kubenswrapper[4703]: I0309 13:44:01.577280 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-create-w7clx" event={"ID":"e9286dca-288a-4d5b-9bca-bed8960bf88c","Type":"ContainerStarted","Data":"015172b03a8c90128f9abc845c3bb6f95d4504d5db161255f1cf3fae989f39ba"} Mar 09 13:44:01 crc kubenswrapper[4703]: I0309 13:44:01.577303 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-create-w7clx" event={"ID":"e9286dca-288a-4d5b-9bca-bed8960bf88c","Type":"ContainerStarted","Data":"b25d1f4e4dd93d1554c48ba2d5106bef10b122e600fa1a854470d6936f037cf3"} Mar 09 13:44:01 crc kubenswrapper[4703]: I0309 13:44:01.598138 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-b5f3-account-create-update-w5hc9" podStartSLOduration=1.598116597 podStartE2EDuration="1.598116597s" podCreationTimestamp="2026-03-09 13:44:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:44:01.59454317 +0000 UTC m=+1437.561958856" watchObservedRunningTime="2026-03-09 13:44:01.598116597 +0000 UTC m=+1437.565532283" Mar 09 13:44:01 crc kubenswrapper[4703]: I0309 13:44:01.619634 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-db-create-w7clx" podStartSLOduration=1.619615384 podStartE2EDuration="1.619615384s" podCreationTimestamp="2026-03-09 13:44:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:44:01.61726198 +0000 UTC m=+1437.584677686" watchObservedRunningTime="2026-03-09 13:44:01.619615384 +0000 UTC m=+1437.587031070" Mar 09 13:44:02 crc kubenswrapper[4703]: I0309 
13:44:02.589319 4703 generic.go:334] "Generic (PLEG): container finished" podID="e9286dca-288a-4d5b-9bca-bed8960bf88c" containerID="015172b03a8c90128f9abc845c3bb6f95d4504d5db161255f1cf3fae989f39ba" exitCode=0 Mar 09 13:44:02 crc kubenswrapper[4703]: I0309 13:44:02.589551 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-create-w7clx" event={"ID":"e9286dca-288a-4d5b-9bca-bed8960bf88c","Type":"ContainerDied","Data":"015172b03a8c90128f9abc845c3bb6f95d4504d5db161255f1cf3fae989f39ba"} Mar 09 13:44:02 crc kubenswrapper[4703]: I0309 13:44:02.594339 4703 generic.go:334] "Generic (PLEG): container finished" podID="97219a68-6ef6-4f8e-947d-da434f9931c4" containerID="ef95328eab0a37adaa01af3ed1ddc38fa9933e9da2321bf2870f73e720f14640" exitCode=0 Mar 09 13:44:02 crc kubenswrapper[4703]: I0309 13:44:02.594544 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551064-q2lc8" event={"ID":"97219a68-6ef6-4f8e-947d-da434f9931c4","Type":"ContainerDied","Data":"ef95328eab0a37adaa01af3ed1ddc38fa9933e9da2321bf2870f73e720f14640"} Mar 09 13:44:02 crc kubenswrapper[4703]: I0309 13:44:02.596455 4703 generic.go:334] "Generic (PLEG): container finished" podID="f75a999f-5349-47b5-9d88-f746113cf48b" containerID="42dc11fd013e2e4632457a5aca0780e3a954bfa4626018fcdf35f7c3fc4eb0d9" exitCode=0 Mar 09 13:44:02 crc kubenswrapper[4703]: I0309 13:44:02.596507 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-b5f3-account-create-update-w5hc9" event={"ID":"f75a999f-5349-47b5-9d88-f746113cf48b","Type":"ContainerDied","Data":"42dc11fd013e2e4632457a5aca0780e3a954bfa4626018fcdf35f7c3fc4eb0d9"} Mar 09 13:44:03 crc kubenswrapper[4703]: I0309 13:44:03.944768 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-b5f3-account-create-update-w5hc9" Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.007856 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551064-q2lc8" Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.011787 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-db-create-w7clx" Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.062346 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f75a999f-5349-47b5-9d88-f746113cf48b-operator-scripts\") pod \"f75a999f-5349-47b5-9d88-f746113cf48b\" (UID: \"f75a999f-5349-47b5-9d88-f746113cf48b\") " Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.062439 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9286dca-288a-4d5b-9bca-bed8960bf88c-operator-scripts\") pod \"e9286dca-288a-4d5b-9bca-bed8960bf88c\" (UID: \"e9286dca-288a-4d5b-9bca-bed8960bf88c\") " Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.062502 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmqrj\" (UniqueName: \"kubernetes.io/projected/f75a999f-5349-47b5-9d88-f746113cf48b-kube-api-access-vmqrj\") pod \"f75a999f-5349-47b5-9d88-f746113cf48b\" (UID: \"f75a999f-5349-47b5-9d88-f746113cf48b\") " Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.062547 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65cj2\" (UniqueName: \"kubernetes.io/projected/e9286dca-288a-4d5b-9bca-bed8960bf88c-kube-api-access-65cj2\") pod \"e9286dca-288a-4d5b-9bca-bed8960bf88c\" (UID: \"e9286dca-288a-4d5b-9bca-bed8960bf88c\") " Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.062577 
4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdszz\" (UniqueName: \"kubernetes.io/projected/97219a68-6ef6-4f8e-947d-da434f9931c4-kube-api-access-rdszz\") pod \"97219a68-6ef6-4f8e-947d-da434f9931c4\" (UID: \"97219a68-6ef6-4f8e-947d-da434f9931c4\") " Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.068151 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9286dca-288a-4d5b-9bca-bed8960bf88c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9286dca-288a-4d5b-9bca-bed8960bf88c" (UID: "e9286dca-288a-4d5b-9bca-bed8960bf88c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.068739 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f75a999f-5349-47b5-9d88-f746113cf48b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f75a999f-5349-47b5-9d88-f746113cf48b" (UID: "f75a999f-5349-47b5-9d88-f746113cf48b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.069366 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97219a68-6ef6-4f8e-947d-da434f9931c4-kube-api-access-rdszz" (OuterVolumeSpecName: "kube-api-access-rdszz") pod "97219a68-6ef6-4f8e-947d-da434f9931c4" (UID: "97219a68-6ef6-4f8e-947d-da434f9931c4"). InnerVolumeSpecName "kube-api-access-rdszz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.081182 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f75a999f-5349-47b5-9d88-f746113cf48b-kube-api-access-vmqrj" (OuterVolumeSpecName: "kube-api-access-vmqrj") pod "f75a999f-5349-47b5-9d88-f746113cf48b" (UID: "f75a999f-5349-47b5-9d88-f746113cf48b"). InnerVolumeSpecName "kube-api-access-vmqrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.081240 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9286dca-288a-4d5b-9bca-bed8960bf88c-kube-api-access-65cj2" (OuterVolumeSpecName: "kube-api-access-65cj2") pod "e9286dca-288a-4d5b-9bca-bed8960bf88c" (UID: "e9286dca-288a-4d5b-9bca-bed8960bf88c"). InnerVolumeSpecName "kube-api-access-65cj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.164411 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmqrj\" (UniqueName: \"kubernetes.io/projected/f75a999f-5349-47b5-9d88-f746113cf48b-kube-api-access-vmqrj\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.164460 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65cj2\" (UniqueName: \"kubernetes.io/projected/e9286dca-288a-4d5b-9bca-bed8960bf88c-kube-api-access-65cj2\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.164473 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdszz\" (UniqueName: \"kubernetes.io/projected/97219a68-6ef6-4f8e-947d-da434f9931c4-kube-api-access-rdszz\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.164485 4703 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f75a999f-5349-47b5-9d88-f746113cf48b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.164497 4703 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9286dca-288a-4d5b-9bca-bed8960bf88c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.612247 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-b5f3-account-create-update-w5hc9" event={"ID":"f75a999f-5349-47b5-9d88-f746113cf48b","Type":"ContainerDied","Data":"357f97b74c9209f039e3c2284789697e2130418c19d5778414915fb4ddf7bf94"} Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.612297 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="357f97b74c9209f039e3c2284789697e2130418c19d5778414915fb4ddf7bf94" Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.612269 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-b5f3-account-create-update-w5hc9" Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.613660 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-create-w7clx" event={"ID":"e9286dca-288a-4d5b-9bca-bed8960bf88c","Type":"ContainerDied","Data":"b25d1f4e4dd93d1554c48ba2d5106bef10b122e600fa1a854470d6936f037cf3"} Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.613683 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b25d1f4e4dd93d1554c48ba2d5106bef10b122e600fa1a854470d6936f037cf3" Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.613716 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-db-create-w7clx" Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.628287 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551064-q2lc8" event={"ID":"97219a68-6ef6-4f8e-947d-da434f9931c4","Type":"ContainerDied","Data":"c69a9e6ebb0412e30375bb345fabfbb6710052481e0cbbfbbf6def9b2efdffe7"} Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.628452 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c69a9e6ebb0412e30375bb345fabfbb6710052481e0cbbfbbf6def9b2efdffe7" Mar 09 13:44:04 crc kubenswrapper[4703]: I0309 13:44:04.628320 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551064-q2lc8" Mar 09 13:44:05 crc kubenswrapper[4703]: I0309 13:44:05.085085 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551058-s5rq7"] Mar 09 13:44:05 crc kubenswrapper[4703]: I0309 13:44:05.091542 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551058-s5rq7"] Mar 09 13:44:05 crc kubenswrapper[4703]: I0309 13:44:05.924651 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-db-sync-282j2"] Mar 09 13:44:05 crc kubenswrapper[4703]: E0309 13:44:05.925255 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9286dca-288a-4d5b-9bca-bed8960bf88c" containerName="mariadb-database-create" Mar 09 13:44:05 crc kubenswrapper[4703]: I0309 13:44:05.925370 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9286dca-288a-4d5b-9bca-bed8960bf88c" containerName="mariadb-database-create" Mar 09 13:44:05 crc kubenswrapper[4703]: E0309 13:44:05.925493 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f75a999f-5349-47b5-9d88-f746113cf48b" containerName="mariadb-account-create-update" Mar 09 13:44:05 crc kubenswrapper[4703]: 
I0309 13:44:05.925586 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f75a999f-5349-47b5-9d88-f746113cf48b" containerName="mariadb-account-create-update" Mar 09 13:44:05 crc kubenswrapper[4703]: E0309 13:44:05.925683 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97219a68-6ef6-4f8e-947d-da434f9931c4" containerName="oc" Mar 09 13:44:05 crc kubenswrapper[4703]: I0309 13:44:05.925822 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="97219a68-6ef6-4f8e-947d-da434f9931c4" containerName="oc" Mar 09 13:44:05 crc kubenswrapper[4703]: I0309 13:44:05.926119 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="97219a68-6ef6-4f8e-947d-da434f9931c4" containerName="oc" Mar 09 13:44:05 crc kubenswrapper[4703]: I0309 13:44:05.926233 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9286dca-288a-4d5b-9bca-bed8960bf88c" containerName="mariadb-database-create" Mar 09 13:44:05 crc kubenswrapper[4703]: I0309 13:44:05.926330 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="f75a999f-5349-47b5-9d88-f746113cf48b" containerName="mariadb-account-create-update" Mar 09 13:44:05 crc kubenswrapper[4703]: I0309 13:44:05.927088 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-db-sync-282j2" Mar 09 13:44:05 crc kubenswrapper[4703]: I0309 13:44:05.930569 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"combined-ca-bundle" Mar 09 13:44:05 crc kubenswrapper[4703]: I0309 13:44:05.930659 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-manila-dockercfg-kjm9l" Mar 09 13:44:05 crc kubenswrapper[4703]: I0309 13:44:05.940651 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-db-sync-282j2"] Mar 09 13:44:05 crc kubenswrapper[4703]: I0309 13:44:05.992933 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhvjf\" (UniqueName: \"kubernetes.io/projected/34cbed37-3818-449d-8781-fa7451928248-kube-api-access-fhvjf\") pod \"manila-db-sync-282j2\" (UID: \"34cbed37-3818-449d-8781-fa7451928248\") " pod="manila-kuttl-tests/manila-db-sync-282j2" Mar 09 13:44:05 crc kubenswrapper[4703]: I0309 13:44:05.992980 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34cbed37-3818-449d-8781-fa7451928248-config-data\") pod \"manila-db-sync-282j2\" (UID: \"34cbed37-3818-449d-8781-fa7451928248\") " pod="manila-kuttl-tests/manila-db-sync-282j2" Mar 09 13:44:05 crc kubenswrapper[4703]: I0309 13:44:05.992998 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/34cbed37-3818-449d-8781-fa7451928248-job-config-data\") pod \"manila-db-sync-282j2\" (UID: \"34cbed37-3818-449d-8781-fa7451928248\") " pod="manila-kuttl-tests/manila-db-sync-282j2" Mar 09 13:44:05 crc kubenswrapper[4703]: I0309 13:44:05.993019 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/34cbed37-3818-449d-8781-fa7451928248-combined-ca-bundle\") pod \"manila-db-sync-282j2\" (UID: \"34cbed37-3818-449d-8781-fa7451928248\") " pod="manila-kuttl-tests/manila-db-sync-282j2" Mar 09 13:44:06 crc kubenswrapper[4703]: I0309 13:44:06.094054 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/34cbed37-3818-449d-8781-fa7451928248-job-config-data\") pod \"manila-db-sync-282j2\" (UID: \"34cbed37-3818-449d-8781-fa7451928248\") " pod="manila-kuttl-tests/manila-db-sync-282j2" Mar 09 13:44:06 crc kubenswrapper[4703]: I0309 13:44:06.094099 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34cbed37-3818-449d-8781-fa7451928248-combined-ca-bundle\") pod \"manila-db-sync-282j2\" (UID: \"34cbed37-3818-449d-8781-fa7451928248\") " pod="manila-kuttl-tests/manila-db-sync-282j2" Mar 09 13:44:06 crc kubenswrapper[4703]: I0309 13:44:06.094197 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhvjf\" (UniqueName: \"kubernetes.io/projected/34cbed37-3818-449d-8781-fa7451928248-kube-api-access-fhvjf\") pod \"manila-db-sync-282j2\" (UID: \"34cbed37-3818-449d-8781-fa7451928248\") " pod="manila-kuttl-tests/manila-db-sync-282j2" Mar 09 13:44:06 crc kubenswrapper[4703]: I0309 13:44:06.094218 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34cbed37-3818-449d-8781-fa7451928248-config-data\") pod \"manila-db-sync-282j2\" (UID: \"34cbed37-3818-449d-8781-fa7451928248\") " pod="manila-kuttl-tests/manila-db-sync-282j2" Mar 09 13:44:06 crc kubenswrapper[4703]: I0309 13:44:06.099409 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34cbed37-3818-449d-8781-fa7451928248-combined-ca-bundle\") 
pod \"manila-db-sync-282j2\" (UID: \"34cbed37-3818-449d-8781-fa7451928248\") " pod="manila-kuttl-tests/manila-db-sync-282j2" Mar 09 13:44:06 crc kubenswrapper[4703]: I0309 13:44:06.099866 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34cbed37-3818-449d-8781-fa7451928248-config-data\") pod \"manila-db-sync-282j2\" (UID: \"34cbed37-3818-449d-8781-fa7451928248\") " pod="manila-kuttl-tests/manila-db-sync-282j2" Mar 09 13:44:06 crc kubenswrapper[4703]: I0309 13:44:06.109231 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/34cbed37-3818-449d-8781-fa7451928248-job-config-data\") pod \"manila-db-sync-282j2\" (UID: \"34cbed37-3818-449d-8781-fa7451928248\") " pod="manila-kuttl-tests/manila-db-sync-282j2" Mar 09 13:44:06 crc kubenswrapper[4703]: I0309 13:44:06.111530 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhvjf\" (UniqueName: \"kubernetes.io/projected/34cbed37-3818-449d-8781-fa7451928248-kube-api-access-fhvjf\") pod \"manila-db-sync-282j2\" (UID: \"34cbed37-3818-449d-8781-fa7451928248\") " pod="manila-kuttl-tests/manila-db-sync-282j2" Mar 09 13:44:06 crc kubenswrapper[4703]: I0309 13:44:06.243620 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-db-sync-282j2" Mar 09 13:44:06 crc kubenswrapper[4703]: I0309 13:44:06.662161 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-db-sync-282j2"] Mar 09 13:44:06 crc kubenswrapper[4703]: W0309 13:44:06.666661 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34cbed37_3818_449d_8781_fa7451928248.slice/crio-e87c1bb7823e40685bc8ced21d18ff4aaf06c0d8e5cdd0634c8c406396863269 WatchSource:0}: Error finding container e87c1bb7823e40685bc8ced21d18ff4aaf06c0d8e5cdd0634c8c406396863269: Status 404 returned error can't find the container with id e87c1bb7823e40685bc8ced21d18ff4aaf06c0d8e5cdd0634c8c406396863269 Mar 09 13:44:06 crc kubenswrapper[4703]: I0309 13:44:06.723274 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a518975f-f324-4119-bb77-6f308f0e8731" path="/var/lib/kubelet/pods/a518975f-f324-4119-bb77-6f308f0e8731/volumes" Mar 09 13:44:07 crc kubenswrapper[4703]: I0309 13:44:07.666286 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-sync-282j2" event={"ID":"34cbed37-3818-449d-8781-fa7451928248","Type":"ContainerStarted","Data":"8b0e34fef625e494409c110885c752bcc083276cf09c6658313db16b5966e6c3"} Mar 09 13:44:07 crc kubenswrapper[4703]: I0309 13:44:07.666586 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-sync-282j2" event={"ID":"34cbed37-3818-449d-8781-fa7451928248","Type":"ContainerStarted","Data":"e87c1bb7823e40685bc8ced21d18ff4aaf06c0d8e5cdd0634c8c406396863269"} Mar 09 13:44:07 crc kubenswrapper[4703]: I0309 13:44:07.698332 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-db-sync-282j2" podStartSLOduration=2.698306318 podStartE2EDuration="2.698306318s" podCreationTimestamp="2026-03-09 13:44:05 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:44:07.693366563 +0000 UTC m=+1443.660782259" watchObservedRunningTime="2026-03-09 13:44:07.698306318 +0000 UTC m=+1443.665722004" Mar 09 13:44:08 crc kubenswrapper[4703]: I0309 13:44:08.677799 4703 generic.go:334] "Generic (PLEG): container finished" podID="34cbed37-3818-449d-8781-fa7451928248" containerID="8b0e34fef625e494409c110885c752bcc083276cf09c6658313db16b5966e6c3" exitCode=0 Mar 09 13:44:08 crc kubenswrapper[4703]: I0309 13:44:08.677922 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-sync-282j2" event={"ID":"34cbed37-3818-449d-8781-fa7451928248","Type":"ContainerDied","Data":"8b0e34fef625e494409c110885c752bcc083276cf09c6658313db16b5966e6c3"} Mar 09 13:44:09 crc kubenswrapper[4703]: I0309 13:44:09.989495 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-db-sync-282j2" Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.059927 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/34cbed37-3818-449d-8781-fa7451928248-job-config-data\") pod \"34cbed37-3818-449d-8781-fa7451928248\" (UID: \"34cbed37-3818-449d-8781-fa7451928248\") " Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.060078 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhvjf\" (UniqueName: \"kubernetes.io/projected/34cbed37-3818-449d-8781-fa7451928248-kube-api-access-fhvjf\") pod \"34cbed37-3818-449d-8781-fa7451928248\" (UID: \"34cbed37-3818-449d-8781-fa7451928248\") " Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.060170 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34cbed37-3818-449d-8781-fa7451928248-combined-ca-bundle\") pod 
\"34cbed37-3818-449d-8781-fa7451928248\" (UID: \"34cbed37-3818-449d-8781-fa7451928248\") " Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.060763 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34cbed37-3818-449d-8781-fa7451928248-config-data\") pod \"34cbed37-3818-449d-8781-fa7451928248\" (UID: \"34cbed37-3818-449d-8781-fa7451928248\") " Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.064959 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34cbed37-3818-449d-8781-fa7451928248-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "34cbed37-3818-449d-8781-fa7451928248" (UID: "34cbed37-3818-449d-8781-fa7451928248"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.065481 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34cbed37-3818-449d-8781-fa7451928248-kube-api-access-fhvjf" (OuterVolumeSpecName: "kube-api-access-fhvjf") pod "34cbed37-3818-449d-8781-fa7451928248" (UID: "34cbed37-3818-449d-8781-fa7451928248"). InnerVolumeSpecName "kube-api-access-fhvjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.067023 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34cbed37-3818-449d-8781-fa7451928248-config-data" (OuterVolumeSpecName: "config-data") pod "34cbed37-3818-449d-8781-fa7451928248" (UID: "34cbed37-3818-449d-8781-fa7451928248"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.086084 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34cbed37-3818-449d-8781-fa7451928248-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34cbed37-3818-449d-8781-fa7451928248" (UID: "34cbed37-3818-449d-8781-fa7451928248"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.161924 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhvjf\" (UniqueName: \"kubernetes.io/projected/34cbed37-3818-449d-8781-fa7451928248-kube-api-access-fhvjf\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.162236 4703 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34cbed37-3818-449d-8781-fa7451928248-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.162250 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34cbed37-3818-449d-8781-fa7451928248-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.162261 4703 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/34cbed37-3818-449d-8781-fa7451928248-job-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.696218 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-db-sync-282j2" event={"ID":"34cbed37-3818-449d-8781-fa7451928248","Type":"ContainerDied","Data":"e87c1bb7823e40685bc8ced21d18ff4aaf06c0d8e5cdd0634c8c406396863269"} Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.696262 4703 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="e87c1bb7823e40685bc8ced21d18ff4aaf06c0d8e5cdd0634c8c406396863269" Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.696326 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-db-sync-282j2" Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.955738 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-scheduler-0"] Mar 09 13:44:10 crc kubenswrapper[4703]: E0309 13:44:10.956184 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34cbed37-3818-449d-8781-fa7451928248" containerName="manila-db-sync" Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.956214 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="34cbed37-3818-449d-8781-fa7451928248" containerName="manila-db-sync" Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.956447 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="34cbed37-3818-449d-8781-fa7451928248" containerName="manila-db-sync" Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.957624 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.960438 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-scheduler-config-data" Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.960449 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-scripts" Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.961788 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-manila-dockercfg-kjm9l" Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.961799 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"combined-ca-bundle" Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.969321 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-scheduler-0"] Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.978725 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-share-share0-0"] Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.980094 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.984207 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-share-share0-0"] Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.985867 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"ceph-conf-files" Mar 09 13:44:10 crc kubenswrapper[4703]: I0309 13:44:10.986110 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-share-share0-config-data" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.080743 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba685b9e-70a5-4243-aa32-700e4d0f8f89-etc-machine-id\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.080809 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.080837 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7v2w\" (UniqueName: \"kubernetes.io/projected/56622617-929d-4188-9de5-94c37d8aac41-kube-api-access-m7v2w\") pod \"manila-scheduler-0\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.080974 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-ceph\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.081081 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56622617-929d-4188-9de5-94c37d8aac41-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.081130 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-scripts\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.081157 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-config-data\") pod \"manila-scheduler-0\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.081230 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-config-data-custom\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.081255 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-scripts\") pod \"manila-scheduler-0\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.081291 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-config-data\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.081351 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/ba685b9e-70a5-4243-aa32-700e4d0f8f89-var-lib-manila\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.081370 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2stth\" (UniqueName: \"kubernetes.io/projected/ba685b9e-70a5-4243-aa32-700e4d0f8f89-kube-api-access-2stth\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.081409 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-combined-ca-bundle\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.081494 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.126240 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manila-api-0"] Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.127579 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.133880 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"cert-manila-internal-svc" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.134088 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"manila-api-config-data" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.134201 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"cert-manila-public-svc" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.153695 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-api-0"] Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.182956 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183050 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm9gq\" (UniqueName: \"kubernetes.io/projected/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-kube-api-access-nm9gq\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " 
pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183103 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba685b9e-70a5-4243-aa32-700e4d0f8f89-etc-machine-id\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183138 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183181 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba685b9e-70a5-4243-aa32-700e4d0f8f89-etc-machine-id\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183183 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-etc-machine-id\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183233 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7v2w\" (UniqueName: \"kubernetes.io/projected/56622617-929d-4188-9de5-94c37d8aac41-kube-api-access-m7v2w\") pod \"manila-scheduler-0\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:11 crc kubenswrapper[4703]: 
I0309 13:44:11.183253 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-ceph\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183271 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-internal-tls-certs\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183289 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-logs\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183309 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183329 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56622617-929d-4188-9de5-94c37d8aac41-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183345 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-scripts\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183359 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-public-tls-certs\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183370 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56622617-929d-4188-9de5-94c37d8aac41-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183381 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-config-data\") pod \"manila-scheduler-0\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183427 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-config-data-custom\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183452 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-scripts\") pod \"manila-scheduler-0\" (UID: 
\"56622617-929d-4188-9de5-94c37d8aac41\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183479 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-scripts\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183499 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-config-data\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183534 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/ba685b9e-70a5-4243-aa32-700e4d0f8f89-var-lib-manila\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183554 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2stth\" (UniqueName: \"kubernetes.io/projected/ba685b9e-70a5-4243-aa32-700e4d0f8f89-kube-api-access-2stth\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183580 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-combined-ca-bundle\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " 
pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183614 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-config-data\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183638 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-config-data-custom\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.183813 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/ba685b9e-70a5-4243-aa32-700e4d0f8f89-var-lib-manila\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.208921 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.209391 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2stth\" (UniqueName: \"kubernetes.io/projected/ba685b9e-70a5-4243-aa32-700e4d0f8f89-kube-api-access-2stth\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: 
I0309 13:44:11.211629 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-config-data\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.213360 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.214323 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-config-data-custom\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.216363 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-combined-ca-bundle\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.216908 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-ceph\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.217385 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-scripts\") pod \"manila-scheduler-0\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.217447 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-config-data\") pod \"manila-scheduler-0\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.218140 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-scripts\") pod \"manila-share-share0-0\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.242675 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7v2w\" (UniqueName: \"kubernetes.io/projected/56622617-929d-4188-9de5-94c37d8aac41-kube-api-access-m7v2w\") pod \"manila-scheduler-0\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.276031 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.284739 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-config-data\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.285030 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-config-data-custom\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.285068 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm9gq\" (UniqueName: \"kubernetes.io/projected/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-kube-api-access-nm9gq\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.285093 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-etc-machine-id\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.285115 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-internal-tls-certs\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.285132 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-logs\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.285152 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.285169 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-public-tls-certs\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.285195 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-scripts\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.289441 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-scripts\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.290057 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-logs\") pod \"manila-api-0\" (UID: 
\"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.290167 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-etc-machine-id\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.290579 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-config-data\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.291070 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-internal-tls-certs\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.296371 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.296572 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-public-tls-certs\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.298434 4703 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-config-data-custom\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.304529 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm9gq\" (UniqueName: \"kubernetes.io/projected/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-kube-api-access-nm9gq\") pod \"manila-api-0\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.304828 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.445114 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.726250 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-scheduler-0"] Mar 09 13:44:11 crc kubenswrapper[4703]: W0309 13:44:11.729441 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56622617_929d_4188_9de5_94c37d8aac41.slice/crio-50ee77c47b5aaaae1e3d1ffdc83dea61637ae03bc8140b8f82ee3095a558e3c0 WatchSource:0}: Error finding container 50ee77c47b5aaaae1e3d1ffdc83dea61637ae03bc8140b8f82ee3095a558e3c0: Status 404 returned error can't find the container with id 50ee77c47b5aaaae1e3d1ffdc83dea61637ae03bc8140b8f82ee3095a558e3c0 Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.766725 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-share-share0-0"] Mar 09 13:44:11 crc kubenswrapper[4703]: I0309 13:44:11.880454 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manila-api-0"] 
Mar 09 13:44:11 crc kubenswrapper[4703]: W0309 13:44:11.886076 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c3886f6_ff37_4519_9eaa_ae33f1121b7a.slice/crio-d0fc91fb816ed8c4a29f082ae40b82ba02bb7ecd6bbe6952e8e4c57c9be3f1b5 WatchSource:0}: Error finding container d0fc91fb816ed8c4a29f082ae40b82ba02bb7ecd6bbe6952e8e4c57c9be3f1b5: Status 404 returned error can't find the container with id d0fc91fb816ed8c4a29f082ae40b82ba02bb7ecd6bbe6952e8e4c57c9be3f1b5 Mar 09 13:44:12 crc kubenswrapper[4703]: I0309 13:44:12.737379 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-0" event={"ID":"56622617-929d-4188-9de5-94c37d8aac41","Type":"ContainerStarted","Data":"64757b41fbe2a81a511f0753cafdc0fc3eadba1980d47860ec7bd92d2fbfc20f"} Mar 09 13:44:12 crc kubenswrapper[4703]: I0309 13:44:12.738094 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-0" event={"ID":"56622617-929d-4188-9de5-94c37d8aac41","Type":"ContainerStarted","Data":"b42b761c6c310eab1e54b1f0e4c6856792a25b3345752da79c6367ecd00f0f98"} Mar 09 13:44:12 crc kubenswrapper[4703]: I0309 13:44:12.738107 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-0" event={"ID":"56622617-929d-4188-9de5-94c37d8aac41","Type":"ContainerStarted","Data":"50ee77c47b5aaaae1e3d1ffdc83dea61637ae03bc8140b8f82ee3095a558e3c0"} Mar 09 13:44:12 crc kubenswrapper[4703]: I0309 13:44:12.740290 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-0" event={"ID":"5c3886f6-ff37-4519-9eaa-ae33f1121b7a","Type":"ContainerStarted","Data":"6de89c8a16bf75a072a23cbade4d85ccbe06e2a461bc93856149a84ac97f36bd"} Mar 09 13:44:12 crc kubenswrapper[4703]: I0309 13:44:12.740316 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-0" 
event={"ID":"5c3886f6-ff37-4519-9eaa-ae33f1121b7a","Type":"ContainerStarted","Data":"d0fc91fb816ed8c4a29f082ae40b82ba02bb7ecd6bbe6952e8e4c57c9be3f1b5"} Mar 09 13:44:12 crc kubenswrapper[4703]: I0309 13:44:12.743763 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share0-0" event={"ID":"ba685b9e-70a5-4243-aa32-700e4d0f8f89","Type":"ContainerStarted","Data":"68d8a6999bbf5b3109140619c1fb7b45b251c1be298e0d98ff926dd8a84fd97f"} Mar 09 13:44:12 crc kubenswrapper[4703]: I0309 13:44:12.743792 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share0-0" event={"ID":"ba685b9e-70a5-4243-aa32-700e4d0f8f89","Type":"ContainerStarted","Data":"dab58e0b4e28c0098ffd7f3679c260d56fcc89eeb54af2e1b9a336c4ebdec92f"} Mar 09 13:44:12 crc kubenswrapper[4703]: I0309 13:44:12.743805 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share0-0" event={"ID":"ba685b9e-70a5-4243-aa32-700e4d0f8f89","Type":"ContainerStarted","Data":"b7c1bbbc82b9020008914f10503210ea1eb78269f27efea2779dfacb7ef436d5"} Mar 09 13:44:12 crc kubenswrapper[4703]: I0309 13:44:12.767526 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-scheduler-0" podStartSLOduration=2.767507191 podStartE2EDuration="2.767507191s" podCreationTimestamp="2026-03-09 13:44:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:44:12.760623404 +0000 UTC m=+1448.728039100" watchObservedRunningTime="2026-03-09 13:44:12.767507191 +0000 UTC m=+1448.734922897" Mar 09 13:44:12 crc kubenswrapper[4703]: I0309 13:44:12.786382 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-share-share0-0" podStartSLOduration=2.786365876 podStartE2EDuration="2.786365876s" podCreationTimestamp="2026-03-09 13:44:10 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:44:12.783361274 +0000 UTC m=+1448.750776960" watchObservedRunningTime="2026-03-09 13:44:12.786365876 +0000 UTC m=+1448.753781562" Mar 09 13:44:13 crc kubenswrapper[4703]: I0309 13:44:13.756604 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-0" event={"ID":"5c3886f6-ff37-4519-9eaa-ae33f1121b7a","Type":"ContainerStarted","Data":"25ef4f6013eeab6e741bbea4a470a5a87a69f29ae644316453a11829bd5c060b"} Mar 09 13:44:14 crc kubenswrapper[4703]: I0309 13:44:14.328036 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manila-api-0" podStartSLOduration=3.328016092 podStartE2EDuration="3.328016092s" podCreationTimestamp="2026-03-09 13:44:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:44:13.789277809 +0000 UTC m=+1449.756693525" watchObservedRunningTime="2026-03-09 13:44:14.328016092 +0000 UTC m=+1450.295431778" Mar 09 13:44:14 crc kubenswrapper[4703]: I0309 13:44:14.332584 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7t6xp"] Mar 09 13:44:14 crc kubenswrapper[4703]: I0309 13:44:14.333995 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7t6xp" Mar 09 13:44:14 crc kubenswrapper[4703]: I0309 13:44:14.342206 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7t6xp"] Mar 09 13:44:14 crc kubenswrapper[4703]: I0309 13:44:14.430691 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e82c5fc-7fff-434f-b3a8-936e8a09d5b3-utilities\") pod \"certified-operators-7t6xp\" (UID: \"0e82c5fc-7fff-434f-b3a8-936e8a09d5b3\") " pod="openshift-marketplace/certified-operators-7t6xp" Mar 09 13:44:14 crc kubenswrapper[4703]: I0309 13:44:14.430760 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klg8d\" (UniqueName: \"kubernetes.io/projected/0e82c5fc-7fff-434f-b3a8-936e8a09d5b3-kube-api-access-klg8d\") pod \"certified-operators-7t6xp\" (UID: \"0e82c5fc-7fff-434f-b3a8-936e8a09d5b3\") " pod="openshift-marketplace/certified-operators-7t6xp" Mar 09 13:44:14 crc kubenswrapper[4703]: I0309 13:44:14.430881 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e82c5fc-7fff-434f-b3a8-936e8a09d5b3-catalog-content\") pod \"certified-operators-7t6xp\" (UID: \"0e82c5fc-7fff-434f-b3a8-936e8a09d5b3\") " pod="openshift-marketplace/certified-operators-7t6xp" Mar 09 13:44:14 crc kubenswrapper[4703]: I0309 13:44:14.531987 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e82c5fc-7fff-434f-b3a8-936e8a09d5b3-utilities\") pod \"certified-operators-7t6xp\" (UID: \"0e82c5fc-7fff-434f-b3a8-936e8a09d5b3\") " pod="openshift-marketplace/certified-operators-7t6xp" Mar 09 13:44:14 crc kubenswrapper[4703]: I0309 13:44:14.532346 4703 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-klg8d\" (UniqueName: \"kubernetes.io/projected/0e82c5fc-7fff-434f-b3a8-936e8a09d5b3-kube-api-access-klg8d\") pod \"certified-operators-7t6xp\" (UID: \"0e82c5fc-7fff-434f-b3a8-936e8a09d5b3\") " pod="openshift-marketplace/certified-operators-7t6xp" Mar 09 13:44:14 crc kubenswrapper[4703]: I0309 13:44:14.532398 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e82c5fc-7fff-434f-b3a8-936e8a09d5b3-catalog-content\") pod \"certified-operators-7t6xp\" (UID: \"0e82c5fc-7fff-434f-b3a8-936e8a09d5b3\") " pod="openshift-marketplace/certified-operators-7t6xp" Mar 09 13:44:14 crc kubenswrapper[4703]: I0309 13:44:14.532450 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e82c5fc-7fff-434f-b3a8-936e8a09d5b3-utilities\") pod \"certified-operators-7t6xp\" (UID: \"0e82c5fc-7fff-434f-b3a8-936e8a09d5b3\") " pod="openshift-marketplace/certified-operators-7t6xp" Mar 09 13:44:14 crc kubenswrapper[4703]: I0309 13:44:14.532769 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e82c5fc-7fff-434f-b3a8-936e8a09d5b3-catalog-content\") pod \"certified-operators-7t6xp\" (UID: \"0e82c5fc-7fff-434f-b3a8-936e8a09d5b3\") " pod="openshift-marketplace/certified-operators-7t6xp" Mar 09 13:44:14 crc kubenswrapper[4703]: I0309 13:44:14.559783 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klg8d\" (UniqueName: \"kubernetes.io/projected/0e82c5fc-7fff-434f-b3a8-936e8a09d5b3-kube-api-access-klg8d\") pod \"certified-operators-7t6xp\" (UID: \"0e82c5fc-7fff-434f-b3a8-936e8a09d5b3\") " pod="openshift-marketplace/certified-operators-7t6xp" Mar 09 13:44:14 crc kubenswrapper[4703]: I0309 13:44:14.650311 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7t6xp" Mar 09 13:44:14 crc kubenswrapper[4703]: I0309 13:44:14.767337 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:15 crc kubenswrapper[4703]: I0309 13:44:15.123166 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7t6xp"] Mar 09 13:44:15 crc kubenswrapper[4703]: I0309 13:44:15.781654 4703 generic.go:334] "Generic (PLEG): container finished" podID="0e82c5fc-7fff-434f-b3a8-936e8a09d5b3" containerID="3a2ff64693211788a6f29eebcf4639bec986e01c797ca976dfa334e2004e5de1" exitCode=0 Mar 09 13:44:15 crc kubenswrapper[4703]: I0309 13:44:15.781829 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7t6xp" event={"ID":"0e82c5fc-7fff-434f-b3a8-936e8a09d5b3","Type":"ContainerDied","Data":"3a2ff64693211788a6f29eebcf4639bec986e01c797ca976dfa334e2004e5de1"} Mar 09 13:44:15 crc kubenswrapper[4703]: I0309 13:44:15.782018 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7t6xp" event={"ID":"0e82c5fc-7fff-434f-b3a8-936e8a09d5b3","Type":"ContainerStarted","Data":"d8eb521df0bc69423dfce6f0accb98974437d1ef210dc2568096188691d91ad4"} Mar 09 13:44:17 crc kubenswrapper[4703]: I0309 13:44:17.796955 4703 generic.go:334] "Generic (PLEG): container finished" podID="0e82c5fc-7fff-434f-b3a8-936e8a09d5b3" containerID="e253a621bd2e317e62eb5512d9b5723177b51749ddee1882893dd647d6dec14e" exitCode=0 Mar 09 13:44:17 crc kubenswrapper[4703]: I0309 13:44:17.797064 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7t6xp" event={"ID":"0e82c5fc-7fff-434f-b3a8-936e8a09d5b3","Type":"ContainerDied","Data":"e253a621bd2e317e62eb5512d9b5723177b51749ddee1882893dd647d6dec14e"} Mar 09 13:44:18 crc kubenswrapper[4703]: I0309 13:44:18.806991 4703 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-7t6xp" event={"ID":"0e82c5fc-7fff-434f-b3a8-936e8a09d5b3","Type":"ContainerStarted","Data":"62d5b30742a96c8e9be5c08d35ffaedae948ed18560afa249f9a12e2b2eee5b4"} Mar 09 13:44:18 crc kubenswrapper[4703]: I0309 13:44:18.834487 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7t6xp" podStartSLOduration=2.40101262 podStartE2EDuration="4.834469566s" podCreationTimestamp="2026-03-09 13:44:14 +0000 UTC" firstStartedPulling="2026-03-09 13:44:15.784589836 +0000 UTC m=+1451.752005522" lastFinishedPulling="2026-03-09 13:44:18.218046772 +0000 UTC m=+1454.185462468" observedRunningTime="2026-03-09 13:44:18.827266399 +0000 UTC m=+1454.794682085" watchObservedRunningTime="2026-03-09 13:44:18.834469566 +0000 UTC m=+1454.801885252" Mar 09 13:44:21 crc kubenswrapper[4703]: I0309 13:44:21.276663 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:21 crc kubenswrapper[4703]: I0309 13:44:21.306230 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:24 crc kubenswrapper[4703]: I0309 13:44:24.651356 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7t6xp" Mar 09 13:44:24 crc kubenswrapper[4703]: I0309 13:44:24.651779 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7t6xp" Mar 09 13:44:24 crc kubenswrapper[4703]: I0309 13:44:24.722238 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7t6xp" Mar 09 13:44:24 crc kubenswrapper[4703]: I0309 13:44:24.889344 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7t6xp" Mar 09 
13:44:24 crc kubenswrapper[4703]: I0309 13:44:24.960447 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7t6xp"] Mar 09 13:44:26 crc kubenswrapper[4703]: I0309 13:44:26.837605 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg" Mar 09 13:44:26 crc kubenswrapper[4703]: I0309 13:44:26.866564 4703 generic.go:334] "Generic (PLEG): container finished" podID="75236b41-aeed-410a-8afd-f3a360b158d6" containerID="0ee71e6952cb1b49915baf19088447877cada46f12c09b4f2a6a9270e58b1583" exitCode=137 Mar 09 13:44:26 crc kubenswrapper[4703]: I0309 13:44:26.866812 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7t6xp" podUID="0e82c5fc-7fff-434f-b3a8-936e8a09d5b3" containerName="registry-server" containerID="cri-o://62d5b30742a96c8e9be5c08d35ffaedae948ed18560afa249f9a12e2b2eee5b4" gracePeriod=2 Mar 09 13:44:26 crc kubenswrapper[4703]: I0309 13:44:26.867186 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg" Mar 09 13:44:26 crc kubenswrapper[4703]: I0309 13:44:26.867487 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg" event={"ID":"75236b41-aeed-410a-8afd-f3a360b158d6","Type":"ContainerDied","Data":"0ee71e6952cb1b49915baf19088447877cada46f12c09b4f2a6a9270e58b1583"} Mar 09 13:44:26 crc kubenswrapper[4703]: I0309 13:44:26.867520 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg" event={"ID":"75236b41-aeed-410a-8afd-f3a360b158d6","Type":"ContainerDied","Data":"1d5a468bbef9a494213fbbd85602c17fb628d1a526cf689cce42b1920d39fa8e"} Mar 09 13:44:26 crc kubenswrapper[4703]: I0309 13:44:26.867539 4703 scope.go:117] "RemoveContainer" containerID="0ee71e6952cb1b49915baf19088447877cada46f12c09b4f2a6a9270e58b1583" Mar 09 13:44:26 crc kubenswrapper[4703]: I0309 13:44:26.901807 4703 scope.go:117] "RemoveContainer" containerID="0ee71e6952cb1b49915baf19088447877cada46f12c09b4f2a6a9270e58b1583" Mar 09 13:44:26 crc kubenswrapper[4703]: E0309 13:44:26.902396 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ee71e6952cb1b49915baf19088447877cada46f12c09b4f2a6a9270e58b1583\": container with ID starting with 0ee71e6952cb1b49915baf19088447877cada46f12c09b4f2a6a9270e58b1583 not found: ID does not exist" containerID="0ee71e6952cb1b49915baf19088447877cada46f12c09b4f2a6a9270e58b1583" Mar 09 13:44:26 crc kubenswrapper[4703]: I0309 13:44:26.902427 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ee71e6952cb1b49915baf19088447877cada46f12c09b4f2a6a9270e58b1583"} err="failed to get container status \"0ee71e6952cb1b49915baf19088447877cada46f12c09b4f2a6a9270e58b1583\": rpc error: code = NotFound desc = could not find container 
\"0ee71e6952cb1b49915baf19088447877cada46f12c09b4f2a6a9270e58b1583\": container with ID starting with 0ee71e6952cb1b49915baf19088447877cada46f12c09b4f2a6a9270e58b1583 not found: ID does not exist" Mar 09 13:44:26 crc kubenswrapper[4703]: I0309 13:44:26.904465 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75236b41-aeed-410a-8afd-f3a360b158d6-config-data\") pod \"75236b41-aeed-410a-8afd-f3a360b158d6\" (UID: \"75236b41-aeed-410a-8afd-f3a360b158d6\") " Mar 09 13:44:26 crc kubenswrapper[4703]: I0309 13:44:26.904513 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhskg\" (UniqueName: \"kubernetes.io/projected/75236b41-aeed-410a-8afd-f3a360b158d6-kube-api-access-mhskg\") pod \"75236b41-aeed-410a-8afd-f3a360b158d6\" (UID: \"75236b41-aeed-410a-8afd-f3a360b158d6\") " Mar 09 13:44:26 crc kubenswrapper[4703]: I0309 13:44:26.904714 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/75236b41-aeed-410a-8afd-f3a360b158d6-job-config-data\") pod \"75236b41-aeed-410a-8afd-f3a360b158d6\" (UID: \"75236b41-aeed-410a-8afd-f3a360b158d6\") " Mar 09 13:44:26 crc kubenswrapper[4703]: I0309 13:44:26.911739 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75236b41-aeed-410a-8afd-f3a360b158d6-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "75236b41-aeed-410a-8afd-f3a360b158d6" (UID: "75236b41-aeed-410a-8afd-f3a360b158d6"). InnerVolumeSpecName "job-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:26 crc kubenswrapper[4703]: I0309 13:44:26.912411 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75236b41-aeed-410a-8afd-f3a360b158d6-kube-api-access-mhskg" (OuterVolumeSpecName: "kube-api-access-mhskg") pod "75236b41-aeed-410a-8afd-f3a360b158d6" (UID: "75236b41-aeed-410a-8afd-f3a360b158d6"). InnerVolumeSpecName "kube-api-access-mhskg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:26 crc kubenswrapper[4703]: I0309 13:44:26.915037 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75236b41-aeed-410a-8afd-f3a360b158d6-config-data" (OuterVolumeSpecName: "config-data") pod "75236b41-aeed-410a-8afd-f3a360b158d6" (UID: "75236b41-aeed-410a-8afd-f3a360b158d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:26 crc kubenswrapper[4703]: I0309 13:44:26.989442 4703 scope.go:117] "RemoveContainer" containerID="f73e451c2a24c80f3ff6825370999dbc9d8476b46cf261308d48d0311aee347a" Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.005412 4703 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/75236b41-aeed-410a-8afd-f3a360b158d6-job-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.005442 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75236b41-aeed-410a-8afd-f3a360b158d6-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.005453 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhskg\" (UniqueName: \"kubernetes.io/projected/75236b41-aeed-410a-8afd-f3a360b158d6-kube-api-access-mhskg\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.206378 4703 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg"] Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.212938 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-service-cleanup-n5b5h655-dm5zg"] Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.309458 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7t6xp" Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.513811 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e82c5fc-7fff-434f-b3a8-936e8a09d5b3-catalog-content\") pod \"0e82c5fc-7fff-434f-b3a8-936e8a09d5b3\" (UID: \"0e82c5fc-7fff-434f-b3a8-936e8a09d5b3\") " Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.514017 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klg8d\" (UniqueName: \"kubernetes.io/projected/0e82c5fc-7fff-434f-b3a8-936e8a09d5b3-kube-api-access-klg8d\") pod \"0e82c5fc-7fff-434f-b3a8-936e8a09d5b3\" (UID: \"0e82c5fc-7fff-434f-b3a8-936e8a09d5b3\") " Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.514180 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e82c5fc-7fff-434f-b3a8-936e8a09d5b3-utilities\") pod \"0e82c5fc-7fff-434f-b3a8-936e8a09d5b3\" (UID: \"0e82c5fc-7fff-434f-b3a8-936e8a09d5b3\") " Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.517891 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e82c5fc-7fff-434f-b3a8-936e8a09d5b3-utilities" (OuterVolumeSpecName: "utilities") pod "0e82c5fc-7fff-434f-b3a8-936e8a09d5b3" (UID: "0e82c5fc-7fff-434f-b3a8-936e8a09d5b3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.519552 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e82c5fc-7fff-434f-b3a8-936e8a09d5b3-kube-api-access-klg8d" (OuterVolumeSpecName: "kube-api-access-klg8d") pod "0e82c5fc-7fff-434f-b3a8-936e8a09d5b3" (UID: "0e82c5fc-7fff-434f-b3a8-936e8a09d5b3"). InnerVolumeSpecName "kube-api-access-klg8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.583778 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e82c5fc-7fff-434f-b3a8-936e8a09d5b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e82c5fc-7fff-434f-b3a8-936e8a09d5b3" (UID: "0e82c5fc-7fff-434f-b3a8-936e8a09d5b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.616162 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e82c5fc-7fff-434f-b3a8-936e8a09d5b3-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.616206 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e82c5fc-7fff-434f-b3a8-936e8a09d5b3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.616222 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klg8d\" (UniqueName: \"kubernetes.io/projected/0e82c5fc-7fff-434f-b3a8-936e8a09d5b3-kube-api-access-klg8d\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.874906 4703 generic.go:334] "Generic (PLEG): container finished" podID="0e82c5fc-7fff-434f-b3a8-936e8a09d5b3" 
containerID="62d5b30742a96c8e9be5c08d35ffaedae948ed18560afa249f9a12e2b2eee5b4" exitCode=0 Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.874974 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7t6xp" event={"ID":"0e82c5fc-7fff-434f-b3a8-936e8a09d5b3","Type":"ContainerDied","Data":"62d5b30742a96c8e9be5c08d35ffaedae948ed18560afa249f9a12e2b2eee5b4"} Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.875004 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7t6xp" Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.875029 4703 scope.go:117] "RemoveContainer" containerID="62d5b30742a96c8e9be5c08d35ffaedae948ed18560afa249f9a12e2b2eee5b4" Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.875013 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7t6xp" event={"ID":"0e82c5fc-7fff-434f-b3a8-936e8a09d5b3","Type":"ContainerDied","Data":"d8eb521df0bc69423dfce6f0accb98974437d1ef210dc2568096188691d91ad4"} Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.896875 4703 scope.go:117] "RemoveContainer" containerID="e253a621bd2e317e62eb5512d9b5723177b51749ddee1882893dd647d6dec14e" Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.925463 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7t6xp"] Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.940222 4703 scope.go:117] "RemoveContainer" containerID="3a2ff64693211788a6f29eebcf4639bec986e01c797ca976dfa334e2004e5de1" Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.948995 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7t6xp"] Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.962349 4703 scope.go:117] "RemoveContainer" containerID="62d5b30742a96c8e9be5c08d35ffaedae948ed18560afa249f9a12e2b2eee5b4" Mar 09 
13:44:27 crc kubenswrapper[4703]: E0309 13:44:27.963209 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62d5b30742a96c8e9be5c08d35ffaedae948ed18560afa249f9a12e2b2eee5b4\": container with ID starting with 62d5b30742a96c8e9be5c08d35ffaedae948ed18560afa249f9a12e2b2eee5b4 not found: ID does not exist" containerID="62d5b30742a96c8e9be5c08d35ffaedae948ed18560afa249f9a12e2b2eee5b4" Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.963266 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62d5b30742a96c8e9be5c08d35ffaedae948ed18560afa249f9a12e2b2eee5b4"} err="failed to get container status \"62d5b30742a96c8e9be5c08d35ffaedae948ed18560afa249f9a12e2b2eee5b4\": rpc error: code = NotFound desc = could not find container \"62d5b30742a96c8e9be5c08d35ffaedae948ed18560afa249f9a12e2b2eee5b4\": container with ID starting with 62d5b30742a96c8e9be5c08d35ffaedae948ed18560afa249f9a12e2b2eee5b4 not found: ID does not exist" Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.963296 4703 scope.go:117] "RemoveContainer" containerID="e253a621bd2e317e62eb5512d9b5723177b51749ddee1882893dd647d6dec14e" Mar 09 13:44:27 crc kubenswrapper[4703]: E0309 13:44:27.963866 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e253a621bd2e317e62eb5512d9b5723177b51749ddee1882893dd647d6dec14e\": container with ID starting with e253a621bd2e317e62eb5512d9b5723177b51749ddee1882893dd647d6dec14e not found: ID does not exist" containerID="e253a621bd2e317e62eb5512d9b5723177b51749ddee1882893dd647d6dec14e" Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.963927 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e253a621bd2e317e62eb5512d9b5723177b51749ddee1882893dd647d6dec14e"} err="failed to get container status 
\"e253a621bd2e317e62eb5512d9b5723177b51749ddee1882893dd647d6dec14e\": rpc error: code = NotFound desc = could not find container \"e253a621bd2e317e62eb5512d9b5723177b51749ddee1882893dd647d6dec14e\": container with ID starting with e253a621bd2e317e62eb5512d9b5723177b51749ddee1882893dd647d6dec14e not found: ID does not exist" Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.963958 4703 scope.go:117] "RemoveContainer" containerID="3a2ff64693211788a6f29eebcf4639bec986e01c797ca976dfa334e2004e5de1" Mar 09 13:44:27 crc kubenswrapper[4703]: E0309 13:44:27.964304 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a2ff64693211788a6f29eebcf4639bec986e01c797ca976dfa334e2004e5de1\": container with ID starting with 3a2ff64693211788a6f29eebcf4639bec986e01c797ca976dfa334e2004e5de1 not found: ID does not exist" containerID="3a2ff64693211788a6f29eebcf4639bec986e01c797ca976dfa334e2004e5de1" Mar 09 13:44:27 crc kubenswrapper[4703]: I0309 13:44:27.964335 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a2ff64693211788a6f29eebcf4639bec986e01c797ca976dfa334e2004e5de1"} err="failed to get container status \"3a2ff64693211788a6f29eebcf4639bec986e01c797ca976dfa334e2004e5de1\": rpc error: code = NotFound desc = could not find container \"3a2ff64693211788a6f29eebcf4639bec986e01c797ca976dfa334e2004e5de1\": container with ID starting with 3a2ff64693211788a6f29eebcf4639bec986e01c797ca976dfa334e2004e5de1 not found: ID does not exist" Mar 09 13:44:28 crc kubenswrapper[4703]: I0309 13:44:28.724237 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e82c5fc-7fff-434f-b3a8-936e8a09d5b3" path="/var/lib/kubelet/pods/0e82c5fc-7fff-434f-b3a8-936e8a09d5b3/volumes" Mar 09 13:44:28 crc kubenswrapper[4703]: I0309 13:44:28.726052 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75236b41-aeed-410a-8afd-f3a360b158d6" 
path="/var/lib/kubelet/pods/75236b41-aeed-410a-8afd-f3a360b158d6/volumes" Mar 09 13:44:32 crc kubenswrapper[4703]: I0309 13:44:32.786443 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:32 crc kubenswrapper[4703]: I0309 13:44:32.794413 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:32 crc kubenswrapper[4703]: I0309 13:44:32.826956 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.687117 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-db-sync-282j2"] Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.692643 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-db-sync-282j2"] Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.745209 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-scheduler-0"] Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.745422 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-scheduler-0" podUID="56622617-929d-4188-9de5-94c37d8aac41" containerName="manila-scheduler" containerID="cri-o://b42b761c6c310eab1e54b1f0e4c6856792a25b3345752da79c6367ecd00f0f98" gracePeriod=30 Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.745544 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-scheduler-0" podUID="56622617-929d-4188-9de5-94c37d8aac41" containerName="probe" containerID="cri-o://64757b41fbe2a81a511f0753cafdc0fc3eadba1980d47860ec7bd92d2fbfc20f" gracePeriod=30 Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.772453 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-share-share0-0"] Mar 09 
13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.772724 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-share-share0-0" podUID="ba685b9e-70a5-4243-aa32-700e4d0f8f89" containerName="manila-share" containerID="cri-o://dab58e0b4e28c0098ffd7f3679c260d56fcc89eeb54af2e1b9a336c4ebdec92f" gracePeriod=30 Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.772813 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-share-share0-0" podUID="ba685b9e-70a5-4243-aa32-700e4d0f8f89" containerName="probe" containerID="cri-o://68d8a6999bbf5b3109140619c1fb7b45b251c1be298e0d98ff926dd8a84fd97f" gracePeriod=30 Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.783792 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-api-0"] Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.784357 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-api-0" podUID="5c3886f6-ff37-4519-9eaa-ae33f1121b7a" containerName="manila-api-log" containerID="cri-o://6de89c8a16bf75a072a23cbade4d85ccbe06e2a461bc93856149a84ac97f36bd" gracePeriod=30 Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.784530 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/manila-api-0" podUID="5c3886f6-ff37-4519-9eaa-ae33f1121b7a" containerName="manila-api" containerID="cri-o://25ef4f6013eeab6e741bbea4a470a5a87a69f29ae644316453a11829bd5c060b" gracePeriod=30 Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.808882 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/manilab5f3-account-delete-k8sl9"] Mar 09 13:44:33 crc kubenswrapper[4703]: E0309 13:44:33.809234 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e82c5fc-7fff-434f-b3a8-936e8a09d5b3" containerName="extract-content" Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 
13:44:33.809250 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e82c5fc-7fff-434f-b3a8-936e8a09d5b3" containerName="extract-content" Mar 09 13:44:33 crc kubenswrapper[4703]: E0309 13:44:33.809274 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e82c5fc-7fff-434f-b3a8-936e8a09d5b3" containerName="extract-utilities" Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.809282 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e82c5fc-7fff-434f-b3a8-936e8a09d5b3" containerName="extract-utilities" Mar 09 13:44:33 crc kubenswrapper[4703]: E0309 13:44:33.809297 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e82c5fc-7fff-434f-b3a8-936e8a09d5b3" containerName="registry-server" Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.809305 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e82c5fc-7fff-434f-b3a8-936e8a09d5b3" containerName="registry-server" Mar 09 13:44:33 crc kubenswrapper[4703]: E0309 13:44:33.809323 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75236b41-aeed-410a-8afd-f3a360b158d6" containerName="manila-service-cleanup-n5b5h655" Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.809332 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="75236b41-aeed-410a-8afd-f3a360b158d6" containerName="manila-service-cleanup-n5b5h655" Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.809477 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e82c5fc-7fff-434f-b3a8-936e8a09d5b3" containerName="registry-server" Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.809505 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="75236b41-aeed-410a-8afd-f3a360b158d6" containerName="manila-service-cleanup-n5b5h655" Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.810108 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manilab5f3-account-delete-k8sl9" Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.824774 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manilab5f3-account-delete-k8sl9"] Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.922869 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5f7j\" (UniqueName: \"kubernetes.io/projected/3b459b52-8877-4a75-a737-2f4c0bec10cf-kube-api-access-x5f7j\") pod \"manilab5f3-account-delete-k8sl9\" (UID: \"3b459b52-8877-4a75-a737-2f4c0bec10cf\") " pod="manila-kuttl-tests/manilab5f3-account-delete-k8sl9" Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.922970 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b459b52-8877-4a75-a737-2f4c0bec10cf-operator-scripts\") pod \"manilab5f3-account-delete-k8sl9\" (UID: \"3b459b52-8877-4a75-a737-2f4c0bec10cf\") " pod="manila-kuttl-tests/manilab5f3-account-delete-k8sl9" Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.933744 4703 generic.go:334] "Generic (PLEG): container finished" podID="5c3886f6-ff37-4519-9eaa-ae33f1121b7a" containerID="6de89c8a16bf75a072a23cbade4d85ccbe06e2a461bc93856149a84ac97f36bd" exitCode=143 Mar 09 13:44:33 crc kubenswrapper[4703]: I0309 13:44:33.933783 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-0" event={"ID":"5c3886f6-ff37-4519-9eaa-ae33f1121b7a","Type":"ContainerDied","Data":"6de89c8a16bf75a072a23cbade4d85ccbe06e2a461bc93856149a84ac97f36bd"} Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.024216 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b459b52-8877-4a75-a737-2f4c0bec10cf-operator-scripts\") pod \"manilab5f3-account-delete-k8sl9\" (UID: 
\"3b459b52-8877-4a75-a737-2f4c0bec10cf\") " pod="manila-kuttl-tests/manilab5f3-account-delete-k8sl9" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.024328 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5f7j\" (UniqueName: \"kubernetes.io/projected/3b459b52-8877-4a75-a737-2f4c0bec10cf-kube-api-access-x5f7j\") pod \"manilab5f3-account-delete-k8sl9\" (UID: \"3b459b52-8877-4a75-a737-2f4c0bec10cf\") " pod="manila-kuttl-tests/manilab5f3-account-delete-k8sl9" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.025159 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b459b52-8877-4a75-a737-2f4c0bec10cf-operator-scripts\") pod \"manilab5f3-account-delete-k8sl9\" (UID: \"3b459b52-8877-4a75-a737-2f4c0bec10cf\") " pod="manila-kuttl-tests/manilab5f3-account-delete-k8sl9" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.048189 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5f7j\" (UniqueName: \"kubernetes.io/projected/3b459b52-8877-4a75-a737-2f4c0bec10cf-kube-api-access-x5f7j\") pod \"manilab5f3-account-delete-k8sl9\" (UID: \"3b459b52-8877-4a75-a737-2f4c0bec10cf\") " pod="manila-kuttl-tests/manilab5f3-account-delete-k8sl9" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.123723 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manilab5f3-account-delete-k8sl9" Mar 09 13:44:34 crc kubenswrapper[4703]: W0309 13:44:34.562902 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b459b52_8877_4a75_a737_2f4c0bec10cf.slice/crio-360cbb30af21f58add34b3a0ea36d27e8dd8a0e77aee4916f87c32c608964272 WatchSource:0}: Error finding container 360cbb30af21f58add34b3a0ea36d27e8dd8a0e77aee4916f87c32c608964272: Status 404 returned error can't find the container with id 360cbb30af21f58add34b3a0ea36d27e8dd8a0e77aee4916f87c32c608964272 Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.563278 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/manilab5f3-account-delete-k8sl9"] Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.582475 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.633532 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf"] Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.633748 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf" podUID="9b93aff6-9e4d-446c-9882-b572d92049db" containerName="manager" containerID="cri-o://082ebc0ed6faefa2e0d09e8c2a5d615882f29741f9d792d2981831bff9706549" gracePeriod=10 Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.718791 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34cbed37-3818-449d-8781-fa7451928248" path="/var/lib/kubelet/pods/34cbed37-3818-449d-8781-fa7451928248/volumes" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.732501 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-combined-ca-bundle\") pod \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.732554 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-config-data-custom\") pod \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.732599 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-ceph\") pod \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.732626 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/ba685b9e-70a5-4243-aa32-700e4d0f8f89-var-lib-manila\") pod \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.732710 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2stth\" (UniqueName: \"kubernetes.io/projected/ba685b9e-70a5-4243-aa32-700e4d0f8f89-kube-api-access-2stth\") pod \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.732812 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba685b9e-70a5-4243-aa32-700e4d0f8f89-etc-machine-id\") pod \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " Mar 09 13:44:34 crc kubenswrapper[4703]: 
I0309 13:44:34.732854 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-scripts\") pod \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.732887 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-config-data\") pod \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\" (UID: \"ba685b9e-70a5-4243-aa32-700e4d0f8f89\") " Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.734042 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba685b9e-70a5-4243-aa32-700e4d0f8f89-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "ba685b9e-70a5-4243-aa32-700e4d0f8f89" (UID: "ba685b9e-70a5-4243-aa32-700e4d0f8f89"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.734899 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba685b9e-70a5-4243-aa32-700e4d0f8f89-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ba685b9e-70a5-4243-aa32-700e4d0f8f89" (UID: "ba685b9e-70a5-4243-aa32-700e4d0f8f89"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.741187 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-ceph" (OuterVolumeSpecName: "ceph") pod "ba685b9e-70a5-4243-aa32-700e4d0f8f89" (UID: "ba685b9e-70a5-4243-aa32-700e4d0f8f89"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.741988 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ba685b9e-70a5-4243-aa32-700e4d0f8f89" (UID: "ba685b9e-70a5-4243-aa32-700e4d0f8f89"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.744549 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-scripts" (OuterVolumeSpecName: "scripts") pod "ba685b9e-70a5-4243-aa32-700e4d0f8f89" (UID: "ba685b9e-70a5-4243-aa32-700e4d0f8f89"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.748702 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba685b9e-70a5-4243-aa32-700e4d0f8f89-kube-api-access-2stth" (OuterVolumeSpecName: "kube-api-access-2stth") pod "ba685b9e-70a5-4243-aa32-700e4d0f8f89" (UID: "ba685b9e-70a5-4243-aa32-700e4d0f8f89"). InnerVolumeSpecName "kube-api-access-2stth". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.815443 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba685b9e-70a5-4243-aa32-700e4d0f8f89" (UID: "ba685b9e-70a5-4243-aa32-700e4d0f8f89"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.830255 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-config-data" (OuterVolumeSpecName: "config-data") pod "ba685b9e-70a5-4243-aa32-700e4d0f8f89" (UID: "ba685b9e-70a5-4243-aa32-700e4d0f8f89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.834183 4703 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.834201 4703 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.834213 4703 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/ba685b9e-70a5-4243-aa32-700e4d0f8f89-var-lib-manila\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.834222 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2stth\" (UniqueName: \"kubernetes.io/projected/ba685b9e-70a5-4243-aa32-700e4d0f8f89-kube-api-access-2stth\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.834233 4703 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba685b9e-70a5-4243-aa32-700e4d0f8f89-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.834240 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.834248 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.834256 4703 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba685b9e-70a5-4243-aa32-700e4d0f8f89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.836505 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/manila-operator-index-nldr4"] Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.836760 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/manila-operator-index-nldr4" podUID="029a9f28-5caa-45af-872a-59de8aa8cdbe" containerName="registry-server" containerID="cri-o://8231820edd08e43e7a8b081acd5f16398c424a1f1e8cc3f009892d03b5089780" gracePeriod=30 Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.873629 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b"] Mar 09 13:44:34 crc kubenswrapper[4703]: I0309 13:44:34.877808 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/23feca247cd2fdd8e9fc261728a437cc6bb0e4a359c152f8b6ae634aa6thz9b"] Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.002947 4703 generic.go:334] "Generic (PLEG): container finished" podID="029a9f28-5caa-45af-872a-59de8aa8cdbe" containerID="8231820edd08e43e7a8b081acd5f16398c424a1f1e8cc3f009892d03b5089780" exitCode=0 Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.003182 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-index-nldr4" event={"ID":"029a9f28-5caa-45af-872a-59de8aa8cdbe","Type":"ContainerDied","Data":"8231820edd08e43e7a8b081acd5f16398c424a1f1e8cc3f009892d03b5089780"} Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.029732 4703 generic.go:334] "Generic (PLEG): container finished" podID="9b93aff6-9e4d-446c-9882-b572d92049db" containerID="082ebc0ed6faefa2e0d09e8c2a5d615882f29741f9d792d2981831bff9706549" exitCode=0 Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.029820 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf" event={"ID":"9b93aff6-9e4d-446c-9882-b572d92049db","Type":"ContainerDied","Data":"082ebc0ed6faefa2e0d09e8c2a5d615882f29741f9d792d2981831bff9706549"} Mar 09 13:44:35 crc kubenswrapper[4703]: E0309 13:44:35.041803 4703 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod029a9f28_5caa_45af_872a_59de8aa8cdbe.slice/crio-conmon-8231820edd08e43e7a8b081acd5f16398c424a1f1e8cc3f009892d03b5089780.scope\": RecentStats: unable to find data in memory cache]" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.057090 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manilab5f3-account-delete-k8sl9" event={"ID":"3b459b52-8877-4a75-a737-2f4c0bec10cf","Type":"ContainerStarted","Data":"991fe9f417eebd16c20aad1957932e412e0306698ee1cd36e1c1c45c7ced3d2d"} Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.057145 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manilab5f3-account-delete-k8sl9" event={"ID":"3b459b52-8877-4a75-a737-2f4c0bec10cf","Type":"ContainerStarted","Data":"360cbb30af21f58add34b3a0ea36d27e8dd8a0e77aee4916f87c32c608964272"} Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.083185 4703 generic.go:334] "Generic (PLEG): container finished" 
podID="56622617-929d-4188-9de5-94c37d8aac41" containerID="64757b41fbe2a81a511f0753cafdc0fc3eadba1980d47860ec7bd92d2fbfc20f" exitCode=0 Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.083408 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-0" event={"ID":"56622617-929d-4188-9de5-94c37d8aac41","Type":"ContainerDied","Data":"64757b41fbe2a81a511f0753cafdc0fc3eadba1980d47860ec7bd92d2fbfc20f"} Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.089530 4703 generic.go:334] "Generic (PLEG): container finished" podID="ba685b9e-70a5-4243-aa32-700e4d0f8f89" containerID="68d8a6999bbf5b3109140619c1fb7b45b251c1be298e0d98ff926dd8a84fd97f" exitCode=0 Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.089716 4703 generic.go:334] "Generic (PLEG): container finished" podID="ba685b9e-70a5-4243-aa32-700e4d0f8f89" containerID="dab58e0b4e28c0098ffd7f3679c260d56fcc89eeb54af2e1b9a336c4ebdec92f" exitCode=1 Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.089799 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share0-0" event={"ID":"ba685b9e-70a5-4243-aa32-700e4d0f8f89","Type":"ContainerDied","Data":"68d8a6999bbf5b3109140619c1fb7b45b251c1be298e0d98ff926dd8a84fd97f"} Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.089910 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share0-0" event={"ID":"ba685b9e-70a5-4243-aa32-700e4d0f8f89","Type":"ContainerDied","Data":"dab58e0b4e28c0098ffd7f3679c260d56fcc89eeb54af2e1b9a336c4ebdec92f"} Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.089984 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-share-share0-0" event={"ID":"ba685b9e-70a5-4243-aa32-700e4d0f8f89","Type":"ContainerDied","Data":"b7c1bbbc82b9020008914f10503210ea1eb78269f27efea2779dfacb7ef436d5"} Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.090062 4703 scope.go:117] "RemoveContainer" 
containerID="68d8a6999bbf5b3109140619c1fb7b45b251c1be298e0d98ff926dd8a84fd97f" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.090267 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-share-share0-0" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.099291 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/manilab5f3-account-delete-k8sl9" podStartSLOduration=2.099275488 podStartE2EDuration="2.099275488s" podCreationTimestamp="2026-03-09 13:44:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:44:35.097388877 +0000 UTC m=+1471.064804563" watchObservedRunningTime="2026-03-09 13:44:35.099275488 +0000 UTC m=+1471.066691184" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.120750 4703 scope.go:117] "RemoveContainer" containerID="dab58e0b4e28c0098ffd7f3679c260d56fcc89eeb54af2e1b9a336c4ebdec92f" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.139145 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-share-share0-0"] Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.147259 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-share-share0-0"] Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.150035 4703 scope.go:117] "RemoveContainer" containerID="68d8a6999bbf5b3109140619c1fb7b45b251c1be298e0d98ff926dd8a84fd97f" Mar 09 13:44:35 crc kubenswrapper[4703]: E0309 13:44:35.152227 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68d8a6999bbf5b3109140619c1fb7b45b251c1be298e0d98ff926dd8a84fd97f\": container with ID starting with 68d8a6999bbf5b3109140619c1fb7b45b251c1be298e0d98ff926dd8a84fd97f not found: ID does not exist" 
containerID="68d8a6999bbf5b3109140619c1fb7b45b251c1be298e0d98ff926dd8a84fd97f" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.152266 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68d8a6999bbf5b3109140619c1fb7b45b251c1be298e0d98ff926dd8a84fd97f"} err="failed to get container status \"68d8a6999bbf5b3109140619c1fb7b45b251c1be298e0d98ff926dd8a84fd97f\": rpc error: code = NotFound desc = could not find container \"68d8a6999bbf5b3109140619c1fb7b45b251c1be298e0d98ff926dd8a84fd97f\": container with ID starting with 68d8a6999bbf5b3109140619c1fb7b45b251c1be298e0d98ff926dd8a84fd97f not found: ID does not exist" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.152286 4703 scope.go:117] "RemoveContainer" containerID="dab58e0b4e28c0098ffd7f3679c260d56fcc89eeb54af2e1b9a336c4ebdec92f" Mar 09 13:44:35 crc kubenswrapper[4703]: E0309 13:44:35.152921 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dab58e0b4e28c0098ffd7f3679c260d56fcc89eeb54af2e1b9a336c4ebdec92f\": container with ID starting with dab58e0b4e28c0098ffd7f3679c260d56fcc89eeb54af2e1b9a336c4ebdec92f not found: ID does not exist" containerID="dab58e0b4e28c0098ffd7f3679c260d56fcc89eeb54af2e1b9a336c4ebdec92f" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.152944 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab58e0b4e28c0098ffd7f3679c260d56fcc89eeb54af2e1b9a336c4ebdec92f"} err="failed to get container status \"dab58e0b4e28c0098ffd7f3679c260d56fcc89eeb54af2e1b9a336c4ebdec92f\": rpc error: code = NotFound desc = could not find container \"dab58e0b4e28c0098ffd7f3679c260d56fcc89eeb54af2e1b9a336c4ebdec92f\": container with ID starting with dab58e0b4e28c0098ffd7f3679c260d56fcc89eeb54af2e1b9a336c4ebdec92f not found: ID does not exist" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.152965 4703 scope.go:117] 
"RemoveContainer" containerID="68d8a6999bbf5b3109140619c1fb7b45b251c1be298e0d98ff926dd8a84fd97f" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.154079 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68d8a6999bbf5b3109140619c1fb7b45b251c1be298e0d98ff926dd8a84fd97f"} err="failed to get container status \"68d8a6999bbf5b3109140619c1fb7b45b251c1be298e0d98ff926dd8a84fd97f\": rpc error: code = NotFound desc = could not find container \"68d8a6999bbf5b3109140619c1fb7b45b251c1be298e0d98ff926dd8a84fd97f\": container with ID starting with 68d8a6999bbf5b3109140619c1fb7b45b251c1be298e0d98ff926dd8a84fd97f not found: ID does not exist" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.154118 4703 scope.go:117] "RemoveContainer" containerID="dab58e0b4e28c0098ffd7f3679c260d56fcc89eeb54af2e1b9a336c4ebdec92f" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.154455 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab58e0b4e28c0098ffd7f3679c260d56fcc89eeb54af2e1b9a336c4ebdec92f"} err="failed to get container status \"dab58e0b4e28c0098ffd7f3679c260d56fcc89eeb54af2e1b9a336c4ebdec92f\": rpc error: code = NotFound desc = could not find container \"dab58e0b4e28c0098ffd7f3679c260d56fcc89eeb54af2e1b9a336c4ebdec92f\": container with ID starting with dab58e0b4e28c0098ffd7f3679c260d56fcc89eeb54af2e1b9a336c4ebdec92f not found: ID does not exist" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.239288 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.353696 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b93aff6-9e4d-446c-9882-b572d92049db-apiservice-cert\") pod \"9b93aff6-9e4d-446c-9882-b572d92049db\" (UID: \"9b93aff6-9e4d-446c-9882-b572d92049db\") " Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.353758 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5kxx\" (UniqueName: \"kubernetes.io/projected/9b93aff6-9e4d-446c-9882-b572d92049db-kube-api-access-z5kxx\") pod \"9b93aff6-9e4d-446c-9882-b572d92049db\" (UID: \"9b93aff6-9e4d-446c-9882-b572d92049db\") " Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.353823 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b93aff6-9e4d-446c-9882-b572d92049db-webhook-cert\") pod \"9b93aff6-9e4d-446c-9882-b572d92049db\" (UID: \"9b93aff6-9e4d-446c-9882-b572d92049db\") " Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.358975 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b93aff6-9e4d-446c-9882-b572d92049db-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "9b93aff6-9e4d-446c-9882-b572d92049db" (UID: "9b93aff6-9e4d-446c-9882-b572d92049db"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.360937 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b93aff6-9e4d-446c-9882-b572d92049db-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "9b93aff6-9e4d-446c-9882-b572d92049db" (UID: "9b93aff6-9e4d-446c-9882-b572d92049db"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.365360 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b93aff6-9e4d-446c-9882-b572d92049db-kube-api-access-z5kxx" (OuterVolumeSpecName: "kube-api-access-z5kxx") pod "9b93aff6-9e4d-446c-9882-b572d92049db" (UID: "9b93aff6-9e4d-446c-9882-b572d92049db"). InnerVolumeSpecName "kube-api-access-z5kxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.455782 4703 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b93aff6-9e4d-446c-9882-b572d92049db-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.455813 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5kxx\" (UniqueName: \"kubernetes.io/projected/9b93aff6-9e4d-446c-9882-b572d92049db-kube-api-access-z5kxx\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.455825 4703 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b93aff6-9e4d-446c-9882-b572d92049db-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.505928 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-index-nldr4" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.658770 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r2kv\" (UniqueName: \"kubernetes.io/projected/029a9f28-5caa-45af-872a-59de8aa8cdbe-kube-api-access-2r2kv\") pod \"029a9f28-5caa-45af-872a-59de8aa8cdbe\" (UID: \"029a9f28-5caa-45af-872a-59de8aa8cdbe\") " Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.663649 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/029a9f28-5caa-45af-872a-59de8aa8cdbe-kube-api-access-2r2kv" (OuterVolumeSpecName: "kube-api-access-2r2kv") pod "029a9f28-5caa-45af-872a-59de8aa8cdbe" (UID: "029a9f28-5caa-45af-872a-59de8aa8cdbe"). InnerVolumeSpecName "kube-api-access-2r2kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:35 crc kubenswrapper[4703]: I0309 13:44:35.760247 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r2kv\" (UniqueName: \"kubernetes.io/projected/029a9f28-5caa-45af-872a-59de8aa8cdbe-kube-api-access-2r2kv\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:36 crc kubenswrapper[4703]: I0309 13:44:36.097812 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-index-nldr4" event={"ID":"029a9f28-5caa-45af-872a-59de8aa8cdbe","Type":"ContainerDied","Data":"faef0dd36467d07c3f6c9959d276eaa9c8990a234d7d334b88beef7444e1e97e"} Mar 09 13:44:36 crc kubenswrapper[4703]: I0309 13:44:36.097894 4703 scope.go:117] "RemoveContainer" containerID="8231820edd08e43e7a8b081acd5f16398c424a1f1e8cc3f009892d03b5089780" Mar 09 13:44:36 crc kubenswrapper[4703]: I0309 13:44:36.097920 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-index-nldr4" Mar 09 13:44:36 crc kubenswrapper[4703]: I0309 13:44:36.100081 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf" event={"ID":"9b93aff6-9e4d-446c-9882-b572d92049db","Type":"ContainerDied","Data":"68e06b1824eb1382eeae01eb25ebdf1f1cf574506a631784145f7bb6418bb8c2"} Mar 09 13:44:36 crc kubenswrapper[4703]: I0309 13:44:36.100162 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf" Mar 09 13:44:36 crc kubenswrapper[4703]: I0309 13:44:36.103796 4703 generic.go:334] "Generic (PLEG): container finished" podID="3b459b52-8877-4a75-a737-2f4c0bec10cf" containerID="991fe9f417eebd16c20aad1957932e412e0306698ee1cd36e1c1c45c7ced3d2d" exitCode=0 Mar 09 13:44:36 crc kubenswrapper[4703]: I0309 13:44:36.103894 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manilab5f3-account-delete-k8sl9" event={"ID":"3b459b52-8877-4a75-a737-2f4c0bec10cf","Type":"ContainerDied","Data":"991fe9f417eebd16c20aad1957932e412e0306698ee1cd36e1c1c45c7ced3d2d"} Mar 09 13:44:36 crc kubenswrapper[4703]: I0309 13:44:36.121596 4703 scope.go:117] "RemoveContainer" containerID="082ebc0ed6faefa2e0d09e8c2a5d615882f29741f9d792d2981831bff9706549" Mar 09 13:44:36 crc kubenswrapper[4703]: I0309 13:44:36.162571 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/manila-operator-index-nldr4"] Mar 09 13:44:36 crc kubenswrapper[4703]: I0309 13:44:36.180266 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/manila-operator-index-nldr4"] Mar 09 13:44:36 crc kubenswrapper[4703]: I0309 13:44:36.188176 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf"] Mar 09 13:44:36 crc kubenswrapper[4703]: I0309 
13:44:36.195049 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7b694485dc-62gvf"] Mar 09 13:44:36 crc kubenswrapper[4703]: I0309 13:44:36.716378 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="029a9f28-5caa-45af-872a-59de8aa8cdbe" path="/var/lib/kubelet/pods/029a9f28-5caa-45af-872a-59de8aa8cdbe/volumes" Mar 09 13:44:36 crc kubenswrapper[4703]: I0309 13:44:36.717350 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44c9d07a-5579-487a-8ef5-b86c849166ff" path="/var/lib/kubelet/pods/44c9d07a-5579-487a-8ef5-b86c849166ff/volumes" Mar 09 13:44:36 crc kubenswrapper[4703]: I0309 13:44:36.718044 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b93aff6-9e4d-446c-9882-b572d92049db" path="/var/lib/kubelet/pods/9b93aff6-9e4d-446c-9882-b572d92049db/volumes" Mar 09 13:44:36 crc kubenswrapper[4703]: I0309 13:44:36.719048 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba685b9e-70a5-4243-aa32-700e4d0f8f89" path="/var/lib/kubelet/pods/ba685b9e-70a5-4243-aa32-700e4d0f8f89/volumes" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.114426 4703 generic.go:334] "Generic (PLEG): container finished" podID="5c3886f6-ff37-4519-9eaa-ae33f1121b7a" containerID="25ef4f6013eeab6e741bbea4a470a5a87a69f29ae644316453a11829bd5c060b" exitCode=0 Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.114525 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-0" event={"ID":"5c3886f6-ff37-4519-9eaa-ae33f1121b7a","Type":"ContainerDied","Data":"25ef4f6013eeab6e741bbea4a470a5a87a69f29ae644316453a11829bd5c060b"} Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.402448 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.594084 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm9gq\" (UniqueName: \"kubernetes.io/projected/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-kube-api-access-nm9gq\") pod \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.594592 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-config-data\") pod \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.594636 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-internal-tls-certs\") pod \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.594693 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-config-data-custom\") pod \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.594758 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-etc-machine-id\") pod \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.594853 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-combined-ca-bundle\") pod \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.594895 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5c3886f6-ff37-4519-9eaa-ae33f1121b7a" (UID: "5c3886f6-ff37-4519-9eaa-ae33f1121b7a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.595535 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-logs\") pod \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.595611 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-public-tls-certs\") pod \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.595666 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-scripts\") pod \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\" (UID: \"5c3886f6-ff37-4519-9eaa-ae33f1121b7a\") " Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.595985 4703 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 
13:44:37.596076 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-logs" (OuterVolumeSpecName: "logs") pod "5c3886f6-ff37-4519-9eaa-ae33f1121b7a" (UID: "5c3886f6-ff37-4519-9eaa-ae33f1121b7a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.600105 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-kube-api-access-nm9gq" (OuterVolumeSpecName: "kube-api-access-nm9gq") pod "5c3886f6-ff37-4519-9eaa-ae33f1121b7a" (UID: "5c3886f6-ff37-4519-9eaa-ae33f1121b7a"). InnerVolumeSpecName "kube-api-access-nm9gq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.600617 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-scripts" (OuterVolumeSpecName: "scripts") pod "5c3886f6-ff37-4519-9eaa-ae33f1121b7a" (UID: "5c3886f6-ff37-4519-9eaa-ae33f1121b7a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.600712 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5c3886f6-ff37-4519-9eaa-ae33f1121b7a" (UID: "5c3886f6-ff37-4519-9eaa-ae33f1121b7a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.609788 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manilab5f3-account-delete-k8sl9" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.620386 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c3886f6-ff37-4519-9eaa-ae33f1121b7a" (UID: "5c3886f6-ff37-4519-9eaa-ae33f1121b7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.644480 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-config-data" (OuterVolumeSpecName: "config-data") pod "5c3886f6-ff37-4519-9eaa-ae33f1121b7a" (UID: "5c3886f6-ff37-4519-9eaa-ae33f1121b7a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.648432 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5c3886f6-ff37-4519-9eaa-ae33f1121b7a" (UID: "5c3886f6-ff37-4519-9eaa-ae33f1121b7a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.663666 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5c3886f6-ff37-4519-9eaa-ae33f1121b7a" (UID: "5c3886f6-ff37-4519-9eaa-ae33f1121b7a"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.698062 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.698277 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nm9gq\" (UniqueName: \"kubernetes.io/projected/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-kube-api-access-nm9gq\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.698334 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.698385 4703 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.698433 4703 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.698492 4703 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.698604 4703 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-logs\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.698666 4703 
reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c3886f6-ff37-4519-9eaa-ae33f1121b7a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.800264 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b459b52-8877-4a75-a737-2f4c0bec10cf-operator-scripts\") pod \"3b459b52-8877-4a75-a737-2f4c0bec10cf\" (UID: \"3b459b52-8877-4a75-a737-2f4c0bec10cf\") " Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.800317 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5f7j\" (UniqueName: \"kubernetes.io/projected/3b459b52-8877-4a75-a737-2f4c0bec10cf-kube-api-access-x5f7j\") pod \"3b459b52-8877-4a75-a737-2f4c0bec10cf\" (UID: \"3b459b52-8877-4a75-a737-2f4c0bec10cf\") " Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.801360 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b459b52-8877-4a75-a737-2f4c0bec10cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3b459b52-8877-4a75-a737-2f4c0bec10cf" (UID: "3b459b52-8877-4a75-a737-2f4c0bec10cf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.806181 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b459b52-8877-4a75-a737-2f4c0bec10cf-kube-api-access-x5f7j" (OuterVolumeSpecName: "kube-api-access-x5f7j") pod "3b459b52-8877-4a75-a737-2f4c0bec10cf" (UID: "3b459b52-8877-4a75-a737-2f4c0bec10cf"). InnerVolumeSpecName "kube-api-access-x5f7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.816249 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.902055 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5f7j\" (UniqueName: \"kubernetes.io/projected/3b459b52-8877-4a75-a737-2f4c0bec10cf-kube-api-access-x5f7j\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:37 crc kubenswrapper[4703]: I0309 13:44:37.902089 4703 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b459b52-8877-4a75-a737-2f4c0bec10cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.002812 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-config-data-custom\") pod \"56622617-929d-4188-9de5-94c37d8aac41\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.002952 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56622617-929d-4188-9de5-94c37d8aac41-etc-machine-id\") pod \"56622617-929d-4188-9de5-94c37d8aac41\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.003003 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7v2w\" (UniqueName: \"kubernetes.io/projected/56622617-929d-4188-9de5-94c37d8aac41-kube-api-access-m7v2w\") pod \"56622617-929d-4188-9de5-94c37d8aac41\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.003077 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-config-data\") pod \"56622617-929d-4188-9de5-94c37d8aac41\" 
(UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.003112 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-scripts\") pod \"56622617-929d-4188-9de5-94c37d8aac41\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.003109 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56622617-929d-4188-9de5-94c37d8aac41-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "56622617-929d-4188-9de5-94c37d8aac41" (UID: "56622617-929d-4188-9de5-94c37d8aac41"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.003144 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-combined-ca-bundle\") pod \"56622617-929d-4188-9de5-94c37d8aac41\" (UID: \"56622617-929d-4188-9de5-94c37d8aac41\") " Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.003531 4703 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56622617-929d-4188-9de5-94c37d8aac41-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.007996 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56622617-929d-4188-9de5-94c37d8aac41-kube-api-access-m7v2w" (OuterVolumeSpecName: "kube-api-access-m7v2w") pod "56622617-929d-4188-9de5-94c37d8aac41" (UID: "56622617-929d-4188-9de5-94c37d8aac41"). InnerVolumeSpecName "kube-api-access-m7v2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.008682 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-scripts" (OuterVolumeSpecName: "scripts") pod "56622617-929d-4188-9de5-94c37d8aac41" (UID: "56622617-929d-4188-9de5-94c37d8aac41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.008803 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "56622617-929d-4188-9de5-94c37d8aac41" (UID: "56622617-929d-4188-9de5-94c37d8aac41"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.052182 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56622617-929d-4188-9de5-94c37d8aac41" (UID: "56622617-929d-4188-9de5-94c37d8aac41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.070888 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-config-data" (OuterVolumeSpecName: "config-data") pod "56622617-929d-4188-9de5-94c37d8aac41" (UID: "56622617-929d-4188-9de5-94c37d8aac41"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.104939 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7v2w\" (UniqueName: \"kubernetes.io/projected/56622617-929d-4188-9de5-94c37d8aac41-kube-api-access-m7v2w\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.104990 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.105008 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.105026 4703 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.105046 4703 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/56622617-929d-4188-9de5-94c37d8aac41-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.125486 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manilab5f3-account-delete-k8sl9" event={"ID":"3b459b52-8877-4a75-a737-2f4c0bec10cf","Type":"ContainerDied","Data":"360cbb30af21f58add34b3a0ea36d27e8dd8a0e77aee4916f87c32c608964272"} Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.125525 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="360cbb30af21f58add34b3a0ea36d27e8dd8a0e77aee4916f87c32c608964272" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.125581 
4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manilab5f3-account-delete-k8sl9" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.128924 4703 generic.go:334] "Generic (PLEG): container finished" podID="56622617-929d-4188-9de5-94c37d8aac41" containerID="b42b761c6c310eab1e54b1f0e4c6856792a25b3345752da79c6367ecd00f0f98" exitCode=0 Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.129066 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/manila-scheduler-0" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.129432 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-0" event={"ID":"56622617-929d-4188-9de5-94c37d8aac41","Type":"ContainerDied","Data":"b42b761c6c310eab1e54b1f0e4c6856792a25b3345752da79c6367ecd00f0f98"} Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.129483 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-scheduler-0" event={"ID":"56622617-929d-4188-9de5-94c37d8aac41","Type":"ContainerDied","Data":"50ee77c47b5aaaae1e3d1ffdc83dea61637ae03bc8140b8f82ee3095a558e3c0"} Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.129506 4703 scope.go:117] "RemoveContainer" containerID="64757b41fbe2a81a511f0753cafdc0fc3eadba1980d47860ec7bd92d2fbfc20f" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.132624 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/manila-api-0" event={"ID":"5c3886f6-ff37-4519-9eaa-ae33f1121b7a","Type":"ContainerDied","Data":"d0fc91fb816ed8c4a29f082ae40b82ba02bb7ecd6bbe6952e8e4c57c9be3f1b5"} Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.132700 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/manila-api-0" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.164163 4703 scope.go:117] "RemoveContainer" containerID="b42b761c6c310eab1e54b1f0e4c6856792a25b3345752da79c6367ecd00f0f98" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.182434 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-scheduler-0"] Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.193124 4703 scope.go:117] "RemoveContainer" containerID="64757b41fbe2a81a511f0753cafdc0fc3eadba1980d47860ec7bd92d2fbfc20f" Mar 09 13:44:38 crc kubenswrapper[4703]: E0309 13:44:38.193722 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64757b41fbe2a81a511f0753cafdc0fc3eadba1980d47860ec7bd92d2fbfc20f\": container with ID starting with 64757b41fbe2a81a511f0753cafdc0fc3eadba1980d47860ec7bd92d2fbfc20f not found: ID does not exist" containerID="64757b41fbe2a81a511f0753cafdc0fc3eadba1980d47860ec7bd92d2fbfc20f" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.193796 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64757b41fbe2a81a511f0753cafdc0fc3eadba1980d47860ec7bd92d2fbfc20f"} err="failed to get container status \"64757b41fbe2a81a511f0753cafdc0fc3eadba1980d47860ec7bd92d2fbfc20f\": rpc error: code = NotFound desc = could not find container \"64757b41fbe2a81a511f0753cafdc0fc3eadba1980d47860ec7bd92d2fbfc20f\": container with ID starting with 64757b41fbe2a81a511f0753cafdc0fc3eadba1980d47860ec7bd92d2fbfc20f not found: ID does not exist" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.193835 4703 scope.go:117] "RemoveContainer" containerID="b42b761c6c310eab1e54b1f0e4c6856792a25b3345752da79c6367ecd00f0f98" Mar 09 13:44:38 crc kubenswrapper[4703]: E0309 13:44:38.194293 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"b42b761c6c310eab1e54b1f0e4c6856792a25b3345752da79c6367ecd00f0f98\": container with ID starting with b42b761c6c310eab1e54b1f0e4c6856792a25b3345752da79c6367ecd00f0f98 not found: ID does not exist" containerID="b42b761c6c310eab1e54b1f0e4c6856792a25b3345752da79c6367ecd00f0f98" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.194321 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b42b761c6c310eab1e54b1f0e4c6856792a25b3345752da79c6367ecd00f0f98"} err="failed to get container status \"b42b761c6c310eab1e54b1f0e4c6856792a25b3345752da79c6367ecd00f0f98\": rpc error: code = NotFound desc = could not find container \"b42b761c6c310eab1e54b1f0e4c6856792a25b3345752da79c6367ecd00f0f98\": container with ID starting with b42b761c6c310eab1e54b1f0e4c6856792a25b3345752da79c6367ecd00f0f98 not found: ID does not exist" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.194340 4703 scope.go:117] "RemoveContainer" containerID="25ef4f6013eeab6e741bbea4a470a5a87a69f29ae644316453a11829bd5c060b" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.197189 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-scheduler-0"] Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.202287 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-api-0"] Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.208373 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-api-0"] Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.211920 4703 scope.go:117] "RemoveContainer" containerID="6de89c8a16bf75a072a23cbade4d85ccbe06e2a461bc93856149a84ac97f36bd" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.725147 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56622617-929d-4188-9de5-94c37d8aac41" path="/var/lib/kubelet/pods/56622617-929d-4188-9de5-94c37d8aac41/volumes" Mar 09 13:44:38 crc 
kubenswrapper[4703]: I0309 13:44:38.726737 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c3886f6-ff37-4519-9eaa-ae33f1121b7a" path="/var/lib/kubelet/pods/5c3886f6-ff37-4519-9eaa-ae33f1121b7a/volumes" Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.813715 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-db-create-w7clx"] Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.828029 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-db-create-w7clx"] Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.846718 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manila-b5f3-account-create-update-w5hc9"] Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.853425 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/manilab5f3-account-delete-k8sl9"] Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.859556 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manilab5f3-account-delete-k8sl9"] Mar 09 13:44:38 crc kubenswrapper[4703]: I0309 13:44:38.864375 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/manila-b5f3-account-create-update-w5hc9"] Mar 09 13:44:40 crc kubenswrapper[4703]: I0309 13:44:40.718942 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b459b52-8877-4a75-a737-2f4c0bec10cf" path="/var/lib/kubelet/pods/3b459b52-8877-4a75-a737-2f4c0bec10cf/volumes" Mar 09 13:44:40 crc kubenswrapper[4703]: I0309 13:44:40.720786 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9286dca-288a-4d5b-9bca-bed8960bf88c" path="/var/lib/kubelet/pods/e9286dca-288a-4d5b-9bca-bed8960bf88c/volumes" Mar 09 13:44:40 crc kubenswrapper[4703]: I0309 13:44:40.722081 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f75a999f-5349-47b5-9d88-f746113cf48b" 
path="/var/lib/kubelet/pods/f75a999f-5349-47b5-9d88-f746113cf48b/volumes" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.881185 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/keystone-db-sync-699rj"] Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.887668 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/keystone-bootstrap-km8qk"] Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.893886 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/keystone-db-sync-699rj"] Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.899619 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/keystone-bootstrap-km8qk"] Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.903977 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/keystone-5b95c486b5-lnbh5"] Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.904194 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5" podUID="a9ac0ae5-2538-48ef-bd64-e9887d90ff39" containerName="keystone-api" containerID="cri-o://b73448b46df1614f1432c93f8e29c57755391564c9ee9301fe2e705e381a034a" gracePeriod=30 Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.931211 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/keystone94db-account-delete-2c5th"] Mar 09 13:44:41 crc kubenswrapper[4703]: E0309 13:44:41.931503 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba685b9e-70a5-4243-aa32-700e4d0f8f89" containerName="manila-share" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.931523 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba685b9e-70a5-4243-aa32-700e4d0f8f89" containerName="manila-share" Mar 09 13:44:41 crc kubenswrapper[4703]: E0309 13:44:41.931540 4703 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3b459b52-8877-4a75-a737-2f4c0bec10cf" containerName="mariadb-account-delete" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.931550 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b459b52-8877-4a75-a737-2f4c0bec10cf" containerName="mariadb-account-delete" Mar 09 13:44:41 crc kubenswrapper[4703]: E0309 13:44:41.931570 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56622617-929d-4188-9de5-94c37d8aac41" containerName="manila-scheduler" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.931579 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="56622617-929d-4188-9de5-94c37d8aac41" containerName="manila-scheduler" Mar 09 13:44:41 crc kubenswrapper[4703]: E0309 13:44:41.931594 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba685b9e-70a5-4243-aa32-700e4d0f8f89" containerName="probe" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.931602 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba685b9e-70a5-4243-aa32-700e4d0f8f89" containerName="probe" Mar 09 13:44:41 crc kubenswrapper[4703]: E0309 13:44:41.931618 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c3886f6-ff37-4519-9eaa-ae33f1121b7a" containerName="manila-api-log" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.931627 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c3886f6-ff37-4519-9eaa-ae33f1121b7a" containerName="manila-api-log" Mar 09 13:44:41 crc kubenswrapper[4703]: E0309 13:44:41.931638 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b93aff6-9e4d-446c-9882-b572d92049db" containerName="manager" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.931646 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b93aff6-9e4d-446c-9882-b572d92049db" containerName="manager" Mar 09 13:44:41 crc kubenswrapper[4703]: E0309 13:44:41.931661 4703 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="56622617-929d-4188-9de5-94c37d8aac41" containerName="probe" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.931669 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="56622617-929d-4188-9de5-94c37d8aac41" containerName="probe" Mar 09 13:44:41 crc kubenswrapper[4703]: E0309 13:44:41.931681 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c3886f6-ff37-4519-9eaa-ae33f1121b7a" containerName="manila-api" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.931690 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c3886f6-ff37-4519-9eaa-ae33f1121b7a" containerName="manila-api" Mar 09 13:44:41 crc kubenswrapper[4703]: E0309 13:44:41.931703 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="029a9f28-5caa-45af-872a-59de8aa8cdbe" containerName="registry-server" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.931711 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="029a9f28-5caa-45af-872a-59de8aa8cdbe" containerName="registry-server" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.931863 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b93aff6-9e4d-446c-9882-b572d92049db" containerName="manager" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.931879 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="029a9f28-5caa-45af-872a-59de8aa8cdbe" containerName="registry-server" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.931888 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b459b52-8877-4a75-a737-2f4c0bec10cf" containerName="mariadb-account-delete" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.931900 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba685b9e-70a5-4243-aa32-700e4d0f8f89" containerName="manila-share" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.931908 4703 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ba685b9e-70a5-4243-aa32-700e4d0f8f89" containerName="probe" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.931920 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="56622617-929d-4188-9de5-94c37d8aac41" containerName="manila-scheduler" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.931929 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="56622617-929d-4188-9de5-94c37d8aac41" containerName="probe" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.931937 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c3886f6-ff37-4519-9eaa-ae33f1121b7a" containerName="manila-api" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.931948 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c3886f6-ff37-4519-9eaa-ae33f1121b7a" containerName="manila-api-log" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.932631 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.945113 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/keystone94db-account-delete-2c5th"] Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.975936 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3-operator-scripts\") pod \"keystone94db-account-delete-2c5th\" (UID: \"f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3\") " pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" Mar 09 13:44:41 crc kubenswrapper[4703]: I0309 13:44:41.976030 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbbvt\" (UniqueName: \"kubernetes.io/projected/f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3-kube-api-access-gbbvt\") pod 
\"keystone94db-account-delete-2c5th\" (UID: \"f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3\") " pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.076653 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbbvt\" (UniqueName: \"kubernetes.io/projected/f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3-kube-api-access-gbbvt\") pod \"keystone94db-account-delete-2c5th\" (UID: \"f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3\") " pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.076726 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3-operator-scripts\") pod \"keystone94db-account-delete-2c5th\" (UID: \"f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3\") " pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.077386 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3-operator-scripts\") pod \"keystone94db-account-delete-2c5th\" (UID: \"f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3\") " pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.107220 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbbvt\" (UniqueName: \"kubernetes.io/projected/f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3-kube-api-access-gbbvt\") pod \"keystone94db-account-delete-2c5th\" (UID: \"f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3\") " pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.253065 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.670997 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/keystone94db-account-delete-2c5th"] Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.721207 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e32ebb9-2351-4afc-9313-5ad8d978bed8" path="/var/lib/kubelet/pods/0e32ebb9-2351-4afc-9313-5ad8d978bed8/volumes" Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.722221 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a287bcb-ad78-4309-a44d-deb8e59370e7" path="/var/lib/kubelet/pods/9a287bcb-ad78-4309-a44d-deb8e59370e7/volumes" Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.759080 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/root-account-create-update-95mws"] Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.767808 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/root-account-create-update-95mws"] Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.782980 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["manila-kuttl-tests/root-account-create-update-2rqpx"] Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.783957 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/root-account-create-update-2rqpx" Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.785652 4703 reflector.go:368] Caches populated for *v1.Secret from object-"manila-kuttl-tests"/"openstack-mariadb-root-db-secret" Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.803010 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/openstack-galera-1"] Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.815101 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/openstack-galera-0"] Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.821082 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/openstack-galera-2"] Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.850809 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/root-account-create-update-2rqpx"] Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.863924 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/root-account-create-update-2rqpx"] Mar 09 13:44:42 crc kubenswrapper[4703]: E0309 13:44:42.864382 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-9xwtr operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="manila-kuttl-tests/root-account-create-update-2rqpx" podUID="d3e7a462-f1c1-4b6d-a875-a7f726789508" Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.889569 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3e7a462-f1c1-4b6d-a875-a7f726789508-operator-scripts\") pod \"root-account-create-update-2rqpx\" (UID: \"d3e7a462-f1c1-4b6d-a875-a7f726789508\") " pod="manila-kuttl-tests/root-account-create-update-2rqpx" Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.889781 4703 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xwtr\" (UniqueName: \"kubernetes.io/projected/d3e7a462-f1c1-4b6d-a875-a7f726789508-kube-api-access-9xwtr\") pod \"root-account-create-update-2rqpx\" (UID: \"d3e7a462-f1c1-4b6d-a875-a7f726789508\") " pod="manila-kuttl-tests/root-account-create-update-2rqpx" Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.974595 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/openstack-galera-2" podUID="03a80206-7d7f-42bc-b3ce-57abda992a6a" containerName="galera" containerID="cri-o://01a3a0a2b8c2cf662f72798704bb023d87fb809535be722e6abdc379c5c2a58f" gracePeriod=30 Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.991782 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3e7a462-f1c1-4b6d-a875-a7f726789508-operator-scripts\") pod \"root-account-create-update-2rqpx\" (UID: \"d3e7a462-f1c1-4b6d-a875-a7f726789508\") " pod="manila-kuttl-tests/root-account-create-update-2rqpx" Mar 09 13:44:42 crc kubenswrapper[4703]: I0309 13:44:42.991877 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xwtr\" (UniqueName: \"kubernetes.io/projected/d3e7a462-f1c1-4b6d-a875-a7f726789508-kube-api-access-9xwtr\") pod \"root-account-create-update-2rqpx\" (UID: \"d3e7a462-f1c1-4b6d-a875-a7f726789508\") " pod="manila-kuttl-tests/root-account-create-update-2rqpx" Mar 09 13:44:42 crc kubenswrapper[4703]: E0309 13:44:42.991930 4703 configmap.go:193] Couldn't get configMap manila-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Mar 09 13:44:42 crc kubenswrapper[4703]: E0309 13:44:42.992004 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d3e7a462-f1c1-4b6d-a875-a7f726789508-operator-scripts podName:d3e7a462-f1c1-4b6d-a875-a7f726789508 nodeName:}" failed. 
No retries permitted until 2026-03-09 13:44:43.491983241 +0000 UTC m=+1479.459398927 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d3e7a462-f1c1-4b6d-a875-a7f726789508-operator-scripts") pod "root-account-create-update-2rqpx" (UID: "d3e7a462-f1c1-4b6d-a875-a7f726789508") : configmap "openstack-scripts" not found Mar 09 13:44:42 crc kubenswrapper[4703]: E0309 13:44:42.997714 4703 projected.go:194] Error preparing data for projected volume kube-api-access-9xwtr for pod manila-kuttl-tests/root-account-create-update-2rqpx: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 09 13:44:42 crc kubenswrapper[4703]: E0309 13:44:42.997781 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d3e7a462-f1c1-4b6d-a875-a7f726789508-kube-api-access-9xwtr podName:d3e7a462-f1c1-4b6d-a875-a7f726789508 nodeName:}" failed. No retries permitted until 2026-03-09 13:44:43.497763889 +0000 UTC m=+1479.465179575 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9xwtr" (UniqueName: "kubernetes.io/projected/d3e7a462-f1c1-4b6d-a875-a7f726789508-kube-api-access-9xwtr") pod "root-account-create-update-2rqpx" (UID: "d3e7a462-f1c1-4b6d-a875-a7f726789508") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 09 13:44:43 crc kubenswrapper[4703]: I0309 13:44:43.172542 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" event={"ID":"f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3","Type":"ContainerStarted","Data":"5ad6366905b4ad63706d403d64ffe7703075be0aa35f1138be9e95fad8170496"} Mar 09 13:44:43 crc kubenswrapper[4703]: I0309 13:44:43.172595 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" event={"ID":"f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3","Type":"ContainerStarted","Data":"9def804e8e0dbf8efd2f953cdd40a591a39be927b6a57ee51022b956e5241cb8"} Mar 09 13:44:43 crc kubenswrapper[4703]: I0309 13:44:43.172536 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/root-account-create-update-2rqpx" Mar 09 13:44:43 crc kubenswrapper[4703]: I0309 13:44:43.172961 4703 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" secret="" err="secret \"galera-openstack-dockercfg-thxn8\" not found" Mar 09 13:44:43 crc kubenswrapper[4703]: I0309 13:44:43.191022 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/root-account-create-update-2rqpx" Mar 09 13:44:43 crc kubenswrapper[4703]: E0309 13:44:43.196097 4703 configmap.go:193] Couldn't get configMap manila-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Mar 09 13:44:43 crc kubenswrapper[4703]: E0309 13:44:43.196171 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3-operator-scripts podName:f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3 nodeName:}" failed. No retries permitted until 2026-03-09 13:44:43.696155094 +0000 UTC m=+1479.663570780 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3-operator-scripts") pod "keystone94db-account-delete-2c5th" (UID: "f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3") : configmap "openstack-scripts" not found Mar 09 13:44:43 crc kubenswrapper[4703]: I0309 13:44:43.500296 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" podStartSLOduration=2.500273064 podStartE2EDuration="2.500273064s" podCreationTimestamp="2026-03-09 13:44:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:44:43.205864929 +0000 UTC m=+1479.173280615" watchObservedRunningTime="2026-03-09 13:44:43.500273064 +0000 UTC m=+1479.467688750" Mar 09 13:44:43 crc kubenswrapper[4703]: I0309 13:44:43.501435 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3e7a462-f1c1-4b6d-a875-a7f726789508-operator-scripts\") pod \"root-account-create-update-2rqpx\" (UID: \"d3e7a462-f1c1-4b6d-a875-a7f726789508\") " pod="manila-kuttl-tests/root-account-create-update-2rqpx" Mar 09 13:44:43 crc kubenswrapper[4703]: I0309 13:44:43.501559 4703 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xwtr\" (UniqueName: \"kubernetes.io/projected/d3e7a462-f1c1-4b6d-a875-a7f726789508-kube-api-access-9xwtr\") pod \"root-account-create-update-2rqpx\" (UID: \"d3e7a462-f1c1-4b6d-a875-a7f726789508\") " pod="manila-kuttl-tests/root-account-create-update-2rqpx" Mar 09 13:44:43 crc kubenswrapper[4703]: E0309 13:44:43.502121 4703 configmap.go:193] Couldn't get configMap manila-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Mar 09 13:44:43 crc kubenswrapper[4703]: E0309 13:44:43.502171 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d3e7a462-f1c1-4b6d-a875-a7f726789508-operator-scripts podName:d3e7a462-f1c1-4b6d-a875-a7f726789508 nodeName:}" failed. No retries permitted until 2026-03-09 13:44:44.502155575 +0000 UTC m=+1480.469571271 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d3e7a462-f1c1-4b6d-a875-a7f726789508-operator-scripts") pod "root-account-create-update-2rqpx" (UID: "d3e7a462-f1c1-4b6d-a875-a7f726789508") : configmap "openstack-scripts" not found Mar 09 13:44:43 crc kubenswrapper[4703]: I0309 13:44:43.504231 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/memcached-0"] Mar 09 13:44:43 crc kubenswrapper[4703]: I0309 13:44:43.504458 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/memcached-0" podUID="9b9d9316-361e-431d-ad43-9fd1a8cb72c1" containerName="memcached" containerID="cri-o://7b5dc6f0c381a8f635d088cf9a106ea89eb45e6865dbc15c2ecd1044338bfd9c" gracePeriod=30 Mar 09 13:44:43 crc kubenswrapper[4703]: E0309 13:44:43.506903 4703 projected.go:194] Error preparing data for projected volume kube-api-access-9xwtr for pod manila-kuttl-tests/root-account-create-update-2rqpx: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 09 
13:44:43 crc kubenswrapper[4703]: E0309 13:44:43.506981 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d3e7a462-f1c1-4b6d-a875-a7f726789508-kube-api-access-9xwtr podName:d3e7a462-f1c1-4b6d-a875-a7f726789508 nodeName:}" failed. No retries permitted until 2026-03-09 13:44:44.506961427 +0000 UTC m=+1480.474377113 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-9xwtr" (UniqueName: "kubernetes.io/projected/d3e7a462-f1c1-4b6d-a875-a7f726789508-kube-api-access-9xwtr") pod "root-account-create-update-2rqpx" (UID: "d3e7a462-f1c1-4b6d-a875-a7f726789508") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 09 13:44:43 crc kubenswrapper[4703]: E0309 13:44:43.704093 4703 configmap.go:193] Couldn't get configMap manila-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Mar 09 13:44:43 crc kubenswrapper[4703]: E0309 13:44:43.704150 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3-operator-scripts podName:f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3 nodeName:}" failed. No retries permitted until 2026-03-09 13:44:44.704137558 +0000 UTC m=+1480.671553244 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3-operator-scripts") pod "keystone94db-account-delete-2c5th" (UID: "f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3") : configmap "openstack-scripts" not found Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.020858 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["manila-kuttl-tests/rabbitmq-server-0"] Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.048201 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.180244 4703 generic.go:334] "Generic (PLEG): container finished" podID="f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3" containerID="5ad6366905b4ad63706d403d64ffe7703075be0aa35f1138be9e95fad8170496" exitCode=1 Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.180308 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" event={"ID":"f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3","Type":"ContainerDied","Data":"5ad6366905b4ad63706d403d64ffe7703075be0aa35f1138be9e95fad8170496"} Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.180738 4703 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" secret="" err="secret \"galera-openstack-dockercfg-thxn8\" not found" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.180783 4703 scope.go:117] "RemoveContainer" containerID="5ad6366905b4ad63706d403d64ffe7703075be0aa35f1138be9e95fad8170496" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.182979 4703 generic.go:334] "Generic (PLEG): container finished" podID="03a80206-7d7f-42bc-b3ce-57abda992a6a" containerID="01a3a0a2b8c2cf662f72798704bb023d87fb809535be722e6abdc379c5c2a58f" exitCode=0 Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.183017 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/openstack-galera-2" event={"ID":"03a80206-7d7f-42bc-b3ce-57abda992a6a","Type":"ContainerDied","Data":"01a3a0a2b8c2cf662f72798704bb023d87fb809535be722e6abdc379c5c2a58f"} Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.183032 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/openstack-galera-2" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.183169 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/openstack-galera-2" event={"ID":"03a80206-7d7f-42bc-b3ce-57abda992a6a","Type":"ContainerDied","Data":"08a72f25cbbb69e93ea3bbe3b3869b3cffe9b41ddac4c01ec0c8bb001043574b"} Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.183197 4703 scope.go:117] "RemoveContainer" containerID="01a3a0a2b8c2cf662f72798704bb023d87fb809535be722e6abdc379c5c2a58f" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.183579 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/root-account-create-update-2rqpx" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.216433 4703 scope.go:117] "RemoveContainer" containerID="02ab23f31789f50d5c3f466ca3c74093bc62fc11f46ec2488e638ace358ce235" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.223797 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/03a80206-7d7f-42bc-b3ce-57abda992a6a-config-data-generated\") pod \"03a80206-7d7f-42bc-b3ce-57abda992a6a\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.223860 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/03a80206-7d7f-42bc-b3ce-57abda992a6a-kolla-config\") pod \"03a80206-7d7f-42bc-b3ce-57abda992a6a\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.223929 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03a80206-7d7f-42bc-b3ce-57abda992a6a-operator-scripts\") pod \"03a80206-7d7f-42bc-b3ce-57abda992a6a\" (UID: 
\"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.223989 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"03a80206-7d7f-42bc-b3ce-57abda992a6a\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.224039 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/03a80206-7d7f-42bc-b3ce-57abda992a6a-config-data-default\") pod \"03a80206-7d7f-42bc-b3ce-57abda992a6a\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.224057 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kpcl\" (UniqueName: \"kubernetes.io/projected/03a80206-7d7f-42bc-b3ce-57abda992a6a-kube-api-access-7kpcl\") pod \"03a80206-7d7f-42bc-b3ce-57abda992a6a\" (UID: \"03a80206-7d7f-42bc-b3ce-57abda992a6a\") " Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.225422 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03a80206-7d7f-42bc-b3ce-57abda992a6a-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "03a80206-7d7f-42bc-b3ce-57abda992a6a" (UID: "03a80206-7d7f-42bc-b3ce-57abda992a6a"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.226500 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03a80206-7d7f-42bc-b3ce-57abda992a6a-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "03a80206-7d7f-42bc-b3ce-57abda992a6a" (UID: "03a80206-7d7f-42bc-b3ce-57abda992a6a"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.226565 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03a80206-7d7f-42bc-b3ce-57abda992a6a-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "03a80206-7d7f-42bc-b3ce-57abda992a6a" (UID: "03a80206-7d7f-42bc-b3ce-57abda992a6a"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.226848 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03a80206-7d7f-42bc-b3ce-57abda992a6a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "03a80206-7d7f-42bc-b3ce-57abda992a6a" (UID: "03a80206-7d7f-42bc-b3ce-57abda992a6a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.228810 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/root-account-create-update-2rqpx"] Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.231041 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03a80206-7d7f-42bc-b3ce-57abda992a6a-kube-api-access-7kpcl" (OuterVolumeSpecName: "kube-api-access-7kpcl") pod "03a80206-7d7f-42bc-b3ce-57abda992a6a" (UID: "03a80206-7d7f-42bc-b3ce-57abda992a6a"). InnerVolumeSpecName "kube-api-access-7kpcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.237736 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/root-account-create-update-2rqpx"] Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.239119 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "03a80206-7d7f-42bc-b3ce-57abda992a6a" (UID: "03a80206-7d7f-42bc-b3ce-57abda992a6a"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.281036 4703 scope.go:117] "RemoveContainer" containerID="01a3a0a2b8c2cf662f72798704bb023d87fb809535be722e6abdc379c5c2a58f" Mar 09 13:44:44 crc kubenswrapper[4703]: E0309 13:44:44.281493 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01a3a0a2b8c2cf662f72798704bb023d87fb809535be722e6abdc379c5c2a58f\": container with ID starting with 01a3a0a2b8c2cf662f72798704bb023d87fb809535be722e6abdc379c5c2a58f not found: ID does not exist" containerID="01a3a0a2b8c2cf662f72798704bb023d87fb809535be722e6abdc379c5c2a58f" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.281544 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01a3a0a2b8c2cf662f72798704bb023d87fb809535be722e6abdc379c5c2a58f"} err="failed to get container status \"01a3a0a2b8c2cf662f72798704bb023d87fb809535be722e6abdc379c5c2a58f\": rpc error: code = NotFound desc = could not find container \"01a3a0a2b8c2cf662f72798704bb023d87fb809535be722e6abdc379c5c2a58f\": container with ID starting with 01a3a0a2b8c2cf662f72798704bb023d87fb809535be722e6abdc379c5c2a58f not found: ID does not exist" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.281570 4703 scope.go:117] "RemoveContainer" 
containerID="02ab23f31789f50d5c3f466ca3c74093bc62fc11f46ec2488e638ace358ce235" Mar 09 13:44:44 crc kubenswrapper[4703]: E0309 13:44:44.281833 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02ab23f31789f50d5c3f466ca3c74093bc62fc11f46ec2488e638ace358ce235\": container with ID starting with 02ab23f31789f50d5c3f466ca3c74093bc62fc11f46ec2488e638ace358ce235 not found: ID does not exist" containerID="02ab23f31789f50d5c3f466ca3c74093bc62fc11f46ec2488e638ace358ce235" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.281855 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02ab23f31789f50d5c3f466ca3c74093bc62fc11f46ec2488e638ace358ce235"} err="failed to get container status \"02ab23f31789f50d5c3f466ca3c74093bc62fc11f46ec2488e638ace358ce235\": rpc error: code = NotFound desc = could not find container \"02ab23f31789f50d5c3f466ca3c74093bc62fc11f46ec2488e638ace358ce235\": container with ID starting with 02ab23f31789f50d5c3f466ca3c74093bc62fc11f46ec2488e638ace358ce235 not found: ID does not exist" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.325619 4703 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.325922 4703 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/03a80206-7d7f-42bc-b3ce-57abda992a6a-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.325935 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kpcl\" (UniqueName: \"kubernetes.io/projected/03a80206-7d7f-42bc-b3ce-57abda992a6a-kube-api-access-7kpcl\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 
13:44:44.325946 4703 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/03a80206-7d7f-42bc-b3ce-57abda992a6a-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.325955 4703 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/03a80206-7d7f-42bc-b3ce-57abda992a6a-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.325962 4703 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03a80206-7d7f-42bc-b3ce-57abda992a6a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.339462 4703 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.410454 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/rabbitmq-server-0"] Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.428076 4703 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3e7a462-f1c1-4b6d-a875-a7f726789508-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.428110 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xwtr\" (UniqueName: \"kubernetes.io/projected/d3e7a462-f1c1-4b6d-a875-a7f726789508-kube-api-access-9xwtr\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.428122 4703 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:44 crc 
kubenswrapper[4703]: I0309 13:44:44.459335 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/rabbitmq-server-0" podUID="379d4845-913a-4422-915c-221497738cde" containerName="rabbitmq" containerID="cri-o://07c3b6fc10b62a526fc576929a048d2184716d85f04a4b3dd6d74917004eb4f1" gracePeriod=604800 Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.495444 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/memcached-0" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.508653 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/openstack-galera-2"] Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.512427 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/openstack-galera-2"] Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.629820 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9b9d9316-361e-431d-ad43-9fd1a8cb72c1-kolla-config\") pod \"9b9d9316-361e-431d-ad43-9fd1a8cb72c1\" (UID: \"9b9d9316-361e-431d-ad43-9fd1a8cb72c1\") " Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.629869 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b77qp\" (UniqueName: \"kubernetes.io/projected/9b9d9316-361e-431d-ad43-9fd1a8cb72c1-kube-api-access-b77qp\") pod \"9b9d9316-361e-431d-ad43-9fd1a8cb72c1\" (UID: \"9b9d9316-361e-431d-ad43-9fd1a8cb72c1\") " Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.629891 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b9d9316-361e-431d-ad43-9fd1a8cb72c1-config-data\") pod \"9b9d9316-361e-431d-ad43-9fd1a8cb72c1\" (UID: \"9b9d9316-361e-431d-ad43-9fd1a8cb72c1\") " Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.630313 4703 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b9d9316-361e-431d-ad43-9fd1a8cb72c1-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "9b9d9316-361e-431d-ad43-9fd1a8cb72c1" (UID: "9b9d9316-361e-431d-ad43-9fd1a8cb72c1"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.630664 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b9d9316-361e-431d-ad43-9fd1a8cb72c1-config-data" (OuterVolumeSpecName: "config-data") pod "9b9d9316-361e-431d-ad43-9fd1a8cb72c1" (UID: "9b9d9316-361e-431d-ad43-9fd1a8cb72c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.634729 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b9d9316-361e-431d-ad43-9fd1a8cb72c1-kube-api-access-b77qp" (OuterVolumeSpecName: "kube-api-access-b77qp") pod "9b9d9316-361e-431d-ad43-9fd1a8cb72c1" (UID: "9b9d9316-361e-431d-ad43-9fd1a8cb72c1"). InnerVolumeSpecName "kube-api-access-b77qp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.714578 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03a80206-7d7f-42bc-b3ce-57abda992a6a" path="/var/lib/kubelet/pods/03a80206-7d7f-42bc-b3ce-57abda992a6a/volumes" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.715188 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c10cc726-da66-44d2-9dc8-7af3f4afce0e" path="/var/lib/kubelet/pods/c10cc726-da66-44d2-9dc8-7af3f4afce0e/volumes" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.715535 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3e7a462-f1c1-4b6d-a875-a7f726789508" path="/var/lib/kubelet/pods/d3e7a462-f1c1-4b6d-a875-a7f726789508/volumes" Mar 09 13:44:44 crc kubenswrapper[4703]: E0309 13:44:44.731304 4703 configmap.go:193] Couldn't get configMap manila-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.731780 4703 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9b9d9316-361e-431d-ad43-9fd1a8cb72c1-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:44 crc kubenswrapper[4703]: E0309 13:44:44.731953 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3-operator-scripts podName:f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3 nodeName:}" failed. No retries permitted until 2026-03-09 13:44:46.73193058 +0000 UTC m=+1482.699346266 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3-operator-scripts") pod "keystone94db-account-delete-2c5th" (UID: "f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3") : configmap "openstack-scripts" not found Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.732482 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b77qp\" (UniqueName: \"kubernetes.io/projected/9b9d9316-361e-431d-ad43-9fd1a8cb72c1-kube-api-access-b77qp\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.732504 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b9d9316-361e-431d-ad43-9fd1a8cb72c1-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.895911 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/ceph"] Mar 09 13:44:44 crc kubenswrapper[4703]: I0309 13:44:44.896136 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/ceph" podUID="98ec184b-eb8e-4967-b8ec-17cb6f984ccb" containerName="ceph" containerID="cri-o://d7b39f5cef1fc23c2e91878c4692c968d24bdb7697cd879ccc1febd9f9cad2aa" gracePeriod=30 Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.007965 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/openstack-galera-1" podUID="06f52029-696b-414e-a98e-0266b9f71c15" containerName="galera" containerID="cri-o://626d204f5e2913a04b5bf04dd0d9137600bbecd6f636d6b80600daa15dfa3fd3" gracePeriod=28 Mar 09 13:44:45 crc kubenswrapper[4703]: E0309 13:44:45.016347 4703 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="626d204f5e2913a04b5bf04dd0d9137600bbecd6f636d6b80600daa15dfa3fd3" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 09 13:44:45 crc kubenswrapper[4703]: E0309 13:44:45.019007 4703 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="626d204f5e2913a04b5bf04dd0d9137600bbecd6f636d6b80600daa15dfa3fd3" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 09 13:44:45 crc kubenswrapper[4703]: E0309 13:44:45.021152 4703 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="626d204f5e2913a04b5bf04dd0d9137600bbecd6f636d6b80600daa15dfa3fd3" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 09 13:44:45 crc kubenswrapper[4703]: E0309 13:44:45.021237 4703 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="manila-kuttl-tests/openstack-galera-1" podUID="06f52029-696b-414e-a98e-0266b9f71c15" containerName="galera" Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.200179 4703 generic.go:334] "Generic (PLEG): container finished" podID="9b9d9316-361e-431d-ad43-9fd1a8cb72c1" containerID="7b5dc6f0c381a8f635d088cf9a106ea89eb45e6865dbc15c2ecd1044338bfd9c" exitCode=0 Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.200444 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/memcached-0" event={"ID":"9b9d9316-361e-431d-ad43-9fd1a8cb72c1","Type":"ContainerDied","Data":"7b5dc6f0c381a8f635d088cf9a106ea89eb45e6865dbc15c2ecd1044338bfd9c"} Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.200475 4703 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/memcached-0" event={"ID":"9b9d9316-361e-431d-ad43-9fd1a8cb72c1","Type":"ContainerDied","Data":"917b2310b7527e63cea3f99f09685f63e7d8b91fae0c52252d86c8a8543d368b"} Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.200492 4703 scope.go:117] "RemoveContainer" containerID="7b5dc6f0c381a8f635d088cf9a106ea89eb45e6865dbc15c2ecd1044338bfd9c" Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.200591 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/memcached-0" Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.220096 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/memcached-0"] Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.220750 4703 generic.go:334] "Generic (PLEG): container finished" podID="f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3" containerID="b85350d8e4e126a21b0fbb8b871e01616ab138f3aeba7bee1dfdc931f450f2e9" exitCode=1 Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.221118 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" event={"ID":"f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3","Type":"ContainerDied","Data":"b85350d8e4e126a21b0fbb8b871e01616ab138f3aeba7bee1dfdc931f450f2e9"} Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.221435 4703 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" secret="" err="secret \"galera-openstack-dockercfg-thxn8\" not found" Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.221479 4703 scope.go:117] "RemoveContainer" containerID="b85350d8e4e126a21b0fbb8b871e01616ab138f3aeba7bee1dfdc931f450f2e9" Mar 09 13:44:45 crc kubenswrapper[4703]: E0309 13:44:45.221900 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=keystone94db-account-delete-2c5th_manila-kuttl-tests(f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3)\"" pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" podUID="f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3" Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.223218 4703 generic.go:334] "Generic (PLEG): container finished" podID="a9ac0ae5-2538-48ef-bd64-e9887d90ff39" containerID="b73448b46df1614f1432c93f8e29c57755391564c9ee9301fe2e705e381a034a" exitCode=0 Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.223282 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5" event={"ID":"a9ac0ae5-2538-48ef-bd64-e9887d90ff39","Type":"ContainerDied","Data":"b73448b46df1614f1432c93f8e29c57755391564c9ee9301fe2e705e381a034a"} Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.228881 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/memcached-0"] Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.280507 4703 scope.go:117] "RemoveContainer" containerID="7b5dc6f0c381a8f635d088cf9a106ea89eb45e6865dbc15c2ecd1044338bfd9c" Mar 09 13:44:45 crc kubenswrapper[4703]: E0309 13:44:45.287041 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b5dc6f0c381a8f635d088cf9a106ea89eb45e6865dbc15c2ecd1044338bfd9c\": container with ID starting with 
7b5dc6f0c381a8f635d088cf9a106ea89eb45e6865dbc15c2ecd1044338bfd9c not found: ID does not exist" containerID="7b5dc6f0c381a8f635d088cf9a106ea89eb45e6865dbc15c2ecd1044338bfd9c" Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.287078 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b5dc6f0c381a8f635d088cf9a106ea89eb45e6865dbc15c2ecd1044338bfd9c"} err="failed to get container status \"7b5dc6f0c381a8f635d088cf9a106ea89eb45e6865dbc15c2ecd1044338bfd9c\": rpc error: code = NotFound desc = could not find container \"7b5dc6f0c381a8f635d088cf9a106ea89eb45e6865dbc15c2ecd1044338bfd9c\": container with ID starting with 7b5dc6f0c381a8f635d088cf9a106ea89eb45e6865dbc15c2ecd1044338bfd9c not found: ID does not exist" Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.287102 4703 scope.go:117] "RemoveContainer" containerID="5ad6366905b4ad63706d403d64ffe7703075be0aa35f1138be9e95fad8170496" Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.525414 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5" Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.547033 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-scripts\") pod \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\" (UID: \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\") " Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.547114 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-config-data\") pod \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\" (UID: \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\") " Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.547193 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-fernet-keys\") pod \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\" (UID: \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\") " Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.547281 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-credential-keys\") pod \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\" (UID: \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\") " Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.547403 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvmsl\" (UniqueName: \"kubernetes.io/projected/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-kube-api-access-lvmsl\") pod \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\" (UID: \"a9ac0ae5-2538-48ef-bd64-e9887d90ff39\") " Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.554728 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a9ac0ae5-2538-48ef-bd64-e9887d90ff39" (UID: "a9ac0ae5-2538-48ef-bd64-e9887d90ff39"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.555291 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a9ac0ae5-2538-48ef-bd64-e9887d90ff39" (UID: "a9ac0ae5-2538-48ef-bd64-e9887d90ff39"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.562663 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-scripts" (OuterVolumeSpecName: "scripts") pod "a9ac0ae5-2538-48ef-bd64-e9887d90ff39" (UID: "a9ac0ae5-2538-48ef-bd64-e9887d90ff39"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.564681 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-kube-api-access-lvmsl" (OuterVolumeSpecName: "kube-api-access-lvmsl") pod "a9ac0ae5-2538-48ef-bd64-e9887d90ff39" (UID: "a9ac0ae5-2538-48ef-bd64-e9887d90ff39"). InnerVolumeSpecName "kube-api-access-lvmsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.592021 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-config-data" (OuterVolumeSpecName: "config-data") pod "a9ac0ae5-2538-48ef-bd64-e9887d90ff39" (UID: "a9ac0ae5-2538-48ef-bd64-e9887d90ff39"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.649574 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvmsl\" (UniqueName: \"kubernetes.io/projected/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-kube-api-access-lvmsl\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.649647 4703 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.649659 4703 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.649669 4703 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:45 crc kubenswrapper[4703]: I0309 13:44:45.649679 4703 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a9ac0ae5-2538-48ef-bd64-e9887d90ff39-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.014721 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.162361 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/379d4845-913a-4422-915c-221497738cde-rabbitmq-plugins\") pod \"379d4845-913a-4422-915c-221497738cde\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.162397 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/379d4845-913a-4422-915c-221497738cde-plugins-conf\") pod \"379d4845-913a-4422-915c-221497738cde\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.162522 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2bee0633-6abb-4275-a483-296d2526d79c\") pod \"379d4845-913a-4422-915c-221497738cde\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.162552 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx6n2\" (UniqueName: \"kubernetes.io/projected/379d4845-913a-4422-915c-221497738cde-kube-api-access-nx6n2\") pod \"379d4845-913a-4422-915c-221497738cde\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.162576 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/379d4845-913a-4422-915c-221497738cde-rabbitmq-erlang-cookie\") pod \"379d4845-913a-4422-915c-221497738cde\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.162651 4703 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/379d4845-913a-4422-915c-221497738cde-rabbitmq-confd\") pod \"379d4845-913a-4422-915c-221497738cde\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.162671 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/379d4845-913a-4422-915c-221497738cde-pod-info\") pod \"379d4845-913a-4422-915c-221497738cde\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.162701 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/379d4845-913a-4422-915c-221497738cde-erlang-cookie-secret\") pod \"379d4845-913a-4422-915c-221497738cde\" (UID: \"379d4845-913a-4422-915c-221497738cde\") " Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.163987 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/379d4845-913a-4422-915c-221497738cde-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "379d4845-913a-4422-915c-221497738cde" (UID: "379d4845-913a-4422-915c-221497738cde"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.164262 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/379d4845-913a-4422-915c-221497738cde-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "379d4845-913a-4422-915c-221497738cde" (UID: "379d4845-913a-4422-915c-221497738cde"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.164599 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/379d4845-913a-4422-915c-221497738cde-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "379d4845-913a-4422-915c-221497738cde" (UID: "379d4845-913a-4422-915c-221497738cde"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.165760 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/379d4845-913a-4422-915c-221497738cde-kube-api-access-nx6n2" (OuterVolumeSpecName: "kube-api-access-nx6n2") pod "379d4845-913a-4422-915c-221497738cde" (UID: "379d4845-913a-4422-915c-221497738cde"). InnerVolumeSpecName "kube-api-access-nx6n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.167988 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/379d4845-913a-4422-915c-221497738cde-pod-info" (OuterVolumeSpecName: "pod-info") pod "379d4845-913a-4422-915c-221497738cde" (UID: "379d4845-913a-4422-915c-221497738cde"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.173043 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/379d4845-913a-4422-915c-221497738cde-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "379d4845-913a-4422-915c-221497738cde" (UID: "379d4845-913a-4422-915c-221497738cde"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.177362 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2bee0633-6abb-4275-a483-296d2526d79c" (OuterVolumeSpecName: "persistence") pod "379d4845-913a-4422-915c-221497738cde" (UID: "379d4845-913a-4422-915c-221497738cde"). InnerVolumeSpecName "pvc-2bee0633-6abb-4275-a483-296d2526d79c". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.232719 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5" event={"ID":"a9ac0ae5-2538-48ef-bd64-e9887d90ff39","Type":"ContainerDied","Data":"6da66cdd0c3a51d41a6f98989457a7bbb686fab0ecb381ae740ab69453534247"} Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.232775 4703 scope.go:117] "RemoveContainer" containerID="b73448b46df1614f1432c93f8e29c57755391564c9ee9301fe2e705e381a034a" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.232783 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/keystone-5b95c486b5-lnbh5" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.234108 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/379d4845-913a-4422-915c-221497738cde-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "379d4845-913a-4422-915c-221497738cde" (UID: "379d4845-913a-4422-915c-221497738cde"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.236429 4703 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" secret="" err="secret \"galera-openstack-dockercfg-thxn8\" not found" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.236475 4703 scope.go:117] "RemoveContainer" containerID="b85350d8e4e126a21b0fbb8b871e01616ab138f3aeba7bee1dfdc931f450f2e9" Mar 09 13:44:46 crc kubenswrapper[4703]: E0309 13:44:46.236752 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=keystone94db-account-delete-2c5th_manila-kuttl-tests(f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3)\"" pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" podUID="f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.241218 4703 generic.go:334] "Generic (PLEG): container finished" podID="379d4845-913a-4422-915c-221497738cde" containerID="07c3b6fc10b62a526fc576929a048d2184716d85f04a4b3dd6d74917004eb4f1" exitCode=0 Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.241259 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/rabbitmq-server-0" event={"ID":"379d4845-913a-4422-915c-221497738cde","Type":"ContainerDied","Data":"07c3b6fc10b62a526fc576929a048d2184716d85f04a4b3dd6d74917004eb4f1"} Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.241303 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/rabbitmq-server-0" event={"ID":"379d4845-913a-4422-915c-221497738cde","Type":"ContainerDied","Data":"0eeef2538e59eb0f5e0ef2d63f194b251cd0056f86012bb0b11d28704cf3d733"} Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.241324 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/rabbitmq-server-0" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.259076 4703 scope.go:117] "RemoveContainer" containerID="07c3b6fc10b62a526fc576929a048d2184716d85f04a4b3dd6d74917004eb4f1" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.264549 4703 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/379d4845-913a-4422-915c-221497738cde-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.264583 4703 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/379d4845-913a-4422-915c-221497738cde-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.264596 4703 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/379d4845-913a-4422-915c-221497738cde-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.264624 4703 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-2bee0633-6abb-4275-a483-296d2526d79c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2bee0633-6abb-4275-a483-296d2526d79c\") on node \"crc\" " Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.264638 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx6n2\" (UniqueName: \"kubernetes.io/projected/379d4845-913a-4422-915c-221497738cde-kube-api-access-nx6n2\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.264651 4703 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/379d4845-913a-4422-915c-221497738cde-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 
13:44:46.264662 4703 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/379d4845-913a-4422-915c-221497738cde-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.265076 4703 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/379d4845-913a-4422-915c-221497738cde-pod-info\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.293789 4703 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.294090 4703 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2bee0633-6abb-4275-a483-296d2526d79c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2bee0633-6abb-4275-a483-296d2526d79c") on node "crc" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.294897 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/keystone-5b95c486b5-lnbh5"] Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.299916 4703 scope.go:117] "RemoveContainer" containerID="b80aeb6f8da13444f0021e8f79e848e61d3e5777cdafbc5520d0d0cb340226fa" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.301150 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/keystone-5b95c486b5-lnbh5"] Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.324927 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/rabbitmq-server-0"] Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.330042 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/rabbitmq-server-0"] Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.335420 4703 scope.go:117] "RemoveContainer" containerID="07c3b6fc10b62a526fc576929a048d2184716d85f04a4b3dd6d74917004eb4f1" Mar 
09 13:44:46 crc kubenswrapper[4703]: E0309 13:44:46.336011 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07c3b6fc10b62a526fc576929a048d2184716d85f04a4b3dd6d74917004eb4f1\": container with ID starting with 07c3b6fc10b62a526fc576929a048d2184716d85f04a4b3dd6d74917004eb4f1 not found: ID does not exist" containerID="07c3b6fc10b62a526fc576929a048d2184716d85f04a4b3dd6d74917004eb4f1" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.336041 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07c3b6fc10b62a526fc576929a048d2184716d85f04a4b3dd6d74917004eb4f1"} err="failed to get container status \"07c3b6fc10b62a526fc576929a048d2184716d85f04a4b3dd6d74917004eb4f1\": rpc error: code = NotFound desc = could not find container \"07c3b6fc10b62a526fc576929a048d2184716d85f04a4b3dd6d74917004eb4f1\": container with ID starting with 07c3b6fc10b62a526fc576929a048d2184716d85f04a4b3dd6d74917004eb4f1 not found: ID does not exist" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.336065 4703 scope.go:117] "RemoveContainer" containerID="b80aeb6f8da13444f0021e8f79e848e61d3e5777cdafbc5520d0d0cb340226fa" Mar 09 13:44:46 crc kubenswrapper[4703]: E0309 13:44:46.338422 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b80aeb6f8da13444f0021e8f79e848e61d3e5777cdafbc5520d0d0cb340226fa\": container with ID starting with b80aeb6f8da13444f0021e8f79e848e61d3e5777cdafbc5520d0d0cb340226fa not found: ID does not exist" containerID="b80aeb6f8da13444f0021e8f79e848e61d3e5777cdafbc5520d0d0cb340226fa" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.338458 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80aeb6f8da13444f0021e8f79e848e61d3e5777cdafbc5520d0d0cb340226fa"} err="failed to get container status 
\"b80aeb6f8da13444f0021e8f79e848e61d3e5777cdafbc5520d0d0cb340226fa\": rpc error: code = NotFound desc = could not find container \"b80aeb6f8da13444f0021e8f79e848e61d3e5777cdafbc5520d0d0cb340226fa\": container with ID starting with b80aeb6f8da13444f0021e8f79e848e61d3e5777cdafbc5520d0d0cb340226fa not found: ID does not exist" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.365699 4703 reconciler_common.go:293] "Volume detached for volume \"pvc-2bee0633-6abb-4275-a483-296d2526d79c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2bee0633-6abb-4275-a483-296d2526d79c\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.716918 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="379d4845-913a-4422-915c-221497738cde" path="/var/lib/kubelet/pods/379d4845-913a-4422-915c-221497738cde/volumes" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.718396 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b9d9316-361e-431d-ad43-9fd1a8cb72c1" path="/var/lib/kubelet/pods/9b9d9316-361e-431d-ad43-9fd1a8cb72c1/volumes" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.719607 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9ac0ae5-2538-48ef-bd64-e9887d90ff39" path="/var/lib/kubelet/pods/a9ac0ae5-2538-48ef-bd64-e9887d90ff39/volumes" Mar 09 13:44:46 crc kubenswrapper[4703]: E0309 13:44:46.770013 4703 configmap.go:193] Couldn't get configMap manila-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Mar 09 13:44:46 crc kubenswrapper[4703]: E0309 13:44:46.770085 4703 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3-operator-scripts podName:f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3 nodeName:}" failed. No retries permitted until 2026-03-09 13:44:50.770068146 +0000 UTC m=+1486.737483832 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3-operator-scripts") pod "keystone94db-account-delete-2c5th" (UID: "f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3") : configmap "openstack-scripts" not found Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.931337 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.952477 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/keystone-db-create-8bms6"] Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.954486 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/keystone-db-create-8bms6"] Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.974088 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/keystone94db-account-delete-2c5th"] Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.980982 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/keystone-94db-account-create-update-j46q9"] Mar 09 13:44:46 crc kubenswrapper[4703]: I0309 13:44:46.997649 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/keystone-94db-account-create-update-j46q9"] Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.033359 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="manila-kuttl-tests/openstack-galera-0" podUID="cc5d4c7a-3e87-40f9-9ff9-073fe3de7312" containerName="galera" containerID="cri-o://316ff9945ba6f26880f3367726c47363142cdc5a61befe1efeaf419a030922e9" gracePeriod=26 Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.075755 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/06f52029-696b-414e-a98e-0266b9f71c15-config-data-default\") pod 
\"06f52029-696b-414e-a98e-0266b9f71c15\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.075891 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/06f52029-696b-414e-a98e-0266b9f71c15-config-data-generated\") pod \"06f52029-696b-414e-a98e-0266b9f71c15\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.075925 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06f52029-696b-414e-a98e-0266b9f71c15-operator-scripts\") pod \"06f52029-696b-414e-a98e-0266b9f71c15\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.075953 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"06f52029-696b-414e-a98e-0266b9f71c15\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.076005 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b66gn\" (UniqueName: \"kubernetes.io/projected/06f52029-696b-414e-a98e-0266b9f71c15-kube-api-access-b66gn\") pod \"06f52029-696b-414e-a98e-0266b9f71c15\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.076025 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/06f52029-696b-414e-a98e-0266b9f71c15-kolla-config\") pod \"06f52029-696b-414e-a98e-0266b9f71c15\" (UID: \"06f52029-696b-414e-a98e-0266b9f71c15\") " Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.076555 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/06f52029-696b-414e-a98e-0266b9f71c15-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "06f52029-696b-414e-a98e-0266b9f71c15" (UID: "06f52029-696b-414e-a98e-0266b9f71c15"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.076613 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06f52029-696b-414e-a98e-0266b9f71c15-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "06f52029-696b-414e-a98e-0266b9f71c15" (UID: "06f52029-696b-414e-a98e-0266b9f71c15"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.076705 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06f52029-696b-414e-a98e-0266b9f71c15-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "06f52029-696b-414e-a98e-0266b9f71c15" (UID: "06f52029-696b-414e-a98e-0266b9f71c15"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.077897 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06f52029-696b-414e-a98e-0266b9f71c15-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "06f52029-696b-414e-a98e-0266b9f71c15" (UID: "06f52029-696b-414e-a98e-0266b9f71c15"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.082969 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06f52029-696b-414e-a98e-0266b9f71c15-kube-api-access-b66gn" (OuterVolumeSpecName: "kube-api-access-b66gn") pod "06f52029-696b-414e-a98e-0266b9f71c15" (UID: "06f52029-696b-414e-a98e-0266b9f71c15"). InnerVolumeSpecName "kube-api-access-b66gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.092447 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "mysql-db") pod "06f52029-696b-414e-a98e-0266b9f71c15" (UID: "06f52029-696b-414e-a98e-0266b9f71c15"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.177724 4703 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/06f52029-696b-414e-a98e-0266b9f71c15-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.177757 4703 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06f52029-696b-414e-a98e-0266b9f71c15-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.177786 4703 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.177800 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b66gn\" (UniqueName: \"kubernetes.io/projected/06f52029-696b-414e-a98e-0266b9f71c15-kube-api-access-b66gn\") on node 
\"crc\" DevicePath \"\"" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.177811 4703 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/06f52029-696b-414e-a98e-0266b9f71c15-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.177821 4703 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/06f52029-696b-414e-a98e-0266b9f71c15-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.188379 4703 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.254664 4703 generic.go:334] "Generic (PLEG): container finished" podID="06f52029-696b-414e-a98e-0266b9f71c15" containerID="626d204f5e2913a04b5bf04dd0d9137600bbecd6f636d6b80600daa15dfa3fd3" exitCode=0 Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.254727 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/openstack-galera-1" event={"ID":"06f52029-696b-414e-a98e-0266b9f71c15","Type":"ContainerDied","Data":"626d204f5e2913a04b5bf04dd0d9137600bbecd6f636d6b80600daa15dfa3fd3"} Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.254764 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/openstack-galera-1" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.255216 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/openstack-galera-1" event={"ID":"06f52029-696b-414e-a98e-0266b9f71c15","Type":"ContainerDied","Data":"28a6046a870ef50a5d8f8e278e8c4eb4dd9f779df681c6e0f3f65978e6f5834b"} Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.255356 4703 scope.go:117] "RemoveContainer" containerID="626d204f5e2913a04b5bf04dd0d9137600bbecd6f636d6b80600daa15dfa3fd3" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.279108 4703 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.296238 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/openstack-galera-1"] Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.300035 4703 scope.go:117] "RemoveContainer" containerID="937469ac632bff0dc68baee5aa6e25086b4f030646903883b401fef41fde5b6e" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.301086 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/openstack-galera-1"] Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.320474 4703 scope.go:117] "RemoveContainer" containerID="626d204f5e2913a04b5bf04dd0d9137600bbecd6f636d6b80600daa15dfa3fd3" Mar 09 13:44:47 crc kubenswrapper[4703]: E0309 13:44:47.321236 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"626d204f5e2913a04b5bf04dd0d9137600bbecd6f636d6b80600daa15dfa3fd3\": container with ID starting with 626d204f5e2913a04b5bf04dd0d9137600bbecd6f636d6b80600daa15dfa3fd3 not found: ID does not exist" containerID="626d204f5e2913a04b5bf04dd0d9137600bbecd6f636d6b80600daa15dfa3fd3" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 
13:44:47.321277 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626d204f5e2913a04b5bf04dd0d9137600bbecd6f636d6b80600daa15dfa3fd3"} err="failed to get container status \"626d204f5e2913a04b5bf04dd0d9137600bbecd6f636d6b80600daa15dfa3fd3\": rpc error: code = NotFound desc = could not find container \"626d204f5e2913a04b5bf04dd0d9137600bbecd6f636d6b80600daa15dfa3fd3\": container with ID starting with 626d204f5e2913a04b5bf04dd0d9137600bbecd6f636d6b80600daa15dfa3fd3 not found: ID does not exist" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.321303 4703 scope.go:117] "RemoveContainer" containerID="937469ac632bff0dc68baee5aa6e25086b4f030646903883b401fef41fde5b6e" Mar 09 13:44:47 crc kubenswrapper[4703]: E0309 13:44:47.321711 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"937469ac632bff0dc68baee5aa6e25086b4f030646903883b401fef41fde5b6e\": container with ID starting with 937469ac632bff0dc68baee5aa6e25086b4f030646903883b401fef41fde5b6e not found: ID does not exist" containerID="937469ac632bff0dc68baee5aa6e25086b4f030646903883b401fef41fde5b6e" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.321742 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"937469ac632bff0dc68baee5aa6e25086b4f030646903883b401fef41fde5b6e"} err="failed to get container status \"937469ac632bff0dc68baee5aa6e25086b4f030646903883b401fef41fde5b6e\": rpc error: code = NotFound desc = could not find container \"937469ac632bff0dc68baee5aa6e25086b4f030646903883b401fef41fde5b6e\": container with ID starting with 937469ac632bff0dc68baee5aa6e25086b4f030646903883b401fef41fde5b6e not found: ID does not exist" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.500794 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.690389 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3-operator-scripts\") pod \"f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3\" (UID: \"f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3\") " Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.690827 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbbvt\" (UniqueName: \"kubernetes.io/projected/f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3-kube-api-access-gbbvt\") pod \"f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3\" (UID: \"f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3\") " Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.694291 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3" (UID: "f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.696023 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3-kube-api-access-gbbvt" (OuterVolumeSpecName: "kube-api-access-gbbvt") pod "f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3" (UID: "f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3"). InnerVolumeSpecName "kube-api-access-gbbvt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.792226 4703 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.792266 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbbvt\" (UniqueName: \"kubernetes.io/projected/f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3-kube-api-access-gbbvt\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.837721 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.993983 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-operator-scripts\") pod \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.994362 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-config-data-generated\") pod \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.994411 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.994451 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6hhbh\" (UniqueName: \"kubernetes.io/projected/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-kube-api-access-6hhbh\") pod \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.994496 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-kolla-config\") pod \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.994615 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-config-data-default\") pod \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\" (UID: \"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312\") " Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.994731 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "cc5d4c7a-3e87-40f9-9ff9-073fe3de7312" (UID: "cc5d4c7a-3e87-40f9-9ff9-073fe3de7312"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.995237 4703 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.995320 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "cc5d4c7a-3e87-40f9-9ff9-073fe3de7312" (UID: "cc5d4c7a-3e87-40f9-9ff9-073fe3de7312"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.995451 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "cc5d4c7a-3e87-40f9-9ff9-073fe3de7312" (UID: "cc5d4c7a-3e87-40f9-9ff9-073fe3de7312"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:47 crc kubenswrapper[4703]: I0309 13:44:47.995612 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cc5d4c7a-3e87-40f9-9ff9-073fe3de7312" (UID: "cc5d4c7a-3e87-40f9-9ff9-073fe3de7312"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.002303 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-kube-api-access-6hhbh" (OuterVolumeSpecName: "kube-api-access-6hhbh") pod "cc5d4c7a-3e87-40f9-9ff9-073fe3de7312" (UID: "cc5d4c7a-3e87-40f9-9ff9-073fe3de7312"). InnerVolumeSpecName "kube-api-access-6hhbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.007584 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "cc5d4c7a-3e87-40f9-9ff9-073fe3de7312" (UID: "cc5d4c7a-3e87-40f9-9ff9-073fe3de7312"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.096235 4703 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.096290 4703 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.096340 4703 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.096360 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hhbh\" (UniqueName: \"kubernetes.io/projected/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-kube-api-access-6hhbh\") on node 
\"crc\" DevicePath \"\"" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.096380 4703 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.117209 4703 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.211416 4703 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.269400 4703 generic.go:334] "Generic (PLEG): container finished" podID="cc5d4c7a-3e87-40f9-9ff9-073fe3de7312" containerID="316ff9945ba6f26880f3367726c47363142cdc5a61befe1efeaf419a030922e9" exitCode=0 Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.269451 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/openstack-galera-0" event={"ID":"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312","Type":"ContainerDied","Data":"316ff9945ba6f26880f3367726c47363142cdc5a61befe1efeaf419a030922e9"} Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.269478 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/openstack-galera-0" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.269586 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/openstack-galera-0" event={"ID":"cc5d4c7a-3e87-40f9-9ff9-073fe3de7312","Type":"ContainerDied","Data":"6f8e27954b34a736c62fac15fd26f03de013d5893e442f5cab5a9c8c600a823e"} Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.269635 4703 scope.go:117] "RemoveContainer" containerID="316ff9945ba6f26880f3367726c47363142cdc5a61befe1efeaf419a030922e9" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.275086 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" event={"ID":"f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3","Type":"ContainerDied","Data":"9def804e8e0dbf8efd2f953cdd40a591a39be927b6a57ee51022b956e5241cb8"} Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.275181 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/keystone94db-account-delete-2c5th" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.304356 4703 scope.go:117] "RemoveContainer" containerID="31629a0c9aeb79d45ac6353c332a6a5f78596c0dc81792d6cdea346e9de95757" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.309555 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/openstack-galera-0"] Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.314170 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/openstack-galera-0"] Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.325083 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/keystone94db-account-delete-2c5th"] Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.334953 4703 scope.go:117] "RemoveContainer" containerID="316ff9945ba6f26880f3367726c47363142cdc5a61befe1efeaf419a030922e9" Mar 09 13:44:48 crc kubenswrapper[4703]: E0309 13:44:48.335374 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"316ff9945ba6f26880f3367726c47363142cdc5a61befe1efeaf419a030922e9\": container with ID starting with 316ff9945ba6f26880f3367726c47363142cdc5a61befe1efeaf419a030922e9 not found: ID does not exist" containerID="316ff9945ba6f26880f3367726c47363142cdc5a61befe1efeaf419a030922e9" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.335405 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316ff9945ba6f26880f3367726c47363142cdc5a61befe1efeaf419a030922e9"} err="failed to get container status \"316ff9945ba6f26880f3367726c47363142cdc5a61befe1efeaf419a030922e9\": rpc error: code = NotFound desc = could not find container \"316ff9945ba6f26880f3367726c47363142cdc5a61befe1efeaf419a030922e9\": container with ID starting with 316ff9945ba6f26880f3367726c47363142cdc5a61befe1efeaf419a030922e9 not found: ID does not 
exist" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.335424 4703 scope.go:117] "RemoveContainer" containerID="31629a0c9aeb79d45ac6353c332a6a5f78596c0dc81792d6cdea346e9de95757" Mar 09 13:44:48 crc kubenswrapper[4703]: E0309 13:44:48.335746 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31629a0c9aeb79d45ac6353c332a6a5f78596c0dc81792d6cdea346e9de95757\": container with ID starting with 31629a0c9aeb79d45ac6353c332a6a5f78596c0dc81792d6cdea346e9de95757 not found: ID does not exist" containerID="31629a0c9aeb79d45ac6353c332a6a5f78596c0dc81792d6cdea346e9de95757" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.335764 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31629a0c9aeb79d45ac6353c332a6a5f78596c0dc81792d6cdea346e9de95757"} err="failed to get container status \"31629a0c9aeb79d45ac6353c332a6a5f78596c0dc81792d6cdea346e9de95757\": rpc error: code = NotFound desc = could not find container \"31629a0c9aeb79d45ac6353c332a6a5f78596c0dc81792d6cdea346e9de95757\": container with ID starting with 31629a0c9aeb79d45ac6353c332a6a5f78596c0dc81792d6cdea346e9de95757 not found: ID does not exist" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.335777 4703 scope.go:117] "RemoveContainer" containerID="b85350d8e4e126a21b0fbb8b871e01616ab138f3aeba7bee1dfdc931f450f2e9" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.340395 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/keystone94db-account-delete-2c5th"] Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.714354 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06f52029-696b-414e-a98e-0266b9f71c15" path="/var/lib/kubelet/pods/06f52029-696b-414e-a98e-0266b9f71c15/volumes" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.715006 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8c1b4f34-e53b-4a26-be67-b31b2b248330" path="/var/lib/kubelet/pods/8c1b4f34-e53b-4a26-be67-b31b2b248330/volumes" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.715581 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc5d4c7a-3e87-40f9-9ff9-073fe3de7312" path="/var/lib/kubelet/pods/cc5d4c7a-3e87-40f9-9ff9-073fe3de7312/volumes" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.716702 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3" path="/var/lib/kubelet/pods/f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3/volumes" Mar 09 13:44:48 crc kubenswrapper[4703]: I0309 13:44:48.717224 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc27e659-7a64-47b8-9dd2-8b52464efc1c" path="/var/lib/kubelet/pods/fc27e659-7a64-47b8-9dd2-8b52464efc1c/volumes" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.158122 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551065-dh4f8"] Mar 09 13:45:00 crc kubenswrapper[4703]: E0309 13:45:00.159363 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9ac0ae5-2538-48ef-bd64-e9887d90ff39" containerName="keystone-api" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.159396 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ac0ae5-2538-48ef-bd64-e9887d90ff39" containerName="keystone-api" Mar 09 13:45:00 crc kubenswrapper[4703]: E0309 13:45:00.159427 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b9d9316-361e-431d-ad43-9fd1a8cb72c1" containerName="memcached" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.159445 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b9d9316-361e-431d-ad43-9fd1a8cb72c1" containerName="memcached" Mar 09 13:45:00 crc kubenswrapper[4703]: E0309 13:45:00.159470 4703 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cc5d4c7a-3e87-40f9-9ff9-073fe3de7312" containerName="mysql-bootstrap" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.159487 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc5d4c7a-3e87-40f9-9ff9-073fe3de7312" containerName="mysql-bootstrap" Mar 09 13:45:00 crc kubenswrapper[4703]: E0309 13:45:00.159508 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379d4845-913a-4422-915c-221497738cde" containerName="setup-container" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.159520 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="379d4845-913a-4422-915c-221497738cde" containerName="setup-container" Mar 09 13:45:00 crc kubenswrapper[4703]: E0309 13:45:00.159543 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f52029-696b-414e-a98e-0266b9f71c15" containerName="mysql-bootstrap" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.159555 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f52029-696b-414e-a98e-0266b9f71c15" containerName="mysql-bootstrap" Mar 09 13:45:00 crc kubenswrapper[4703]: E0309 13:45:00.159569 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc5d4c7a-3e87-40f9-9ff9-073fe3de7312" containerName="galera" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.159581 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc5d4c7a-3e87-40f9-9ff9-073fe3de7312" containerName="galera" Mar 09 13:45:00 crc kubenswrapper[4703]: E0309 13:45:00.159596 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f52029-696b-414e-a98e-0266b9f71c15" containerName="galera" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.159608 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f52029-696b-414e-a98e-0266b9f71c15" containerName="galera" Mar 09 13:45:00 crc kubenswrapper[4703]: E0309 13:45:00.159632 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03a80206-7d7f-42bc-b3ce-57abda992a6a" 
containerName="mysql-bootstrap" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.159645 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="03a80206-7d7f-42bc-b3ce-57abda992a6a" containerName="mysql-bootstrap" Mar 09 13:45:00 crc kubenswrapper[4703]: E0309 13:45:00.159664 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3" containerName="mariadb-account-delete" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.159676 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3" containerName="mariadb-account-delete" Mar 09 13:45:00 crc kubenswrapper[4703]: E0309 13:45:00.159692 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03a80206-7d7f-42bc-b3ce-57abda992a6a" containerName="galera" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.159703 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="03a80206-7d7f-42bc-b3ce-57abda992a6a" containerName="galera" Mar 09 13:45:00 crc kubenswrapper[4703]: E0309 13:45:00.159724 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379d4845-913a-4422-915c-221497738cde" containerName="rabbitmq" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.159739 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="379d4845-913a-4422-915c-221497738cde" containerName="rabbitmq" Mar 09 13:45:00 crc kubenswrapper[4703]: E0309 13:45:00.159758 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3" containerName="mariadb-account-delete" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.159773 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3" containerName="mariadb-account-delete" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.160036 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3" 
containerName="mariadb-account-delete" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.160071 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="06f52029-696b-414e-a98e-0266b9f71c15" containerName="galera" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.160093 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="379d4845-913a-4422-915c-221497738cde" containerName="rabbitmq" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.160114 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9ac0ae5-2538-48ef-bd64-e9887d90ff39" containerName="keystone-api" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.160136 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b9d9316-361e-431d-ad43-9fd1a8cb72c1" containerName="memcached" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.160160 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="03a80206-7d7f-42bc-b3ce-57abda992a6a" containerName="galera" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.160175 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc5d4c7a-3e87-40f9-9ff9-073fe3de7312" containerName="galera" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.160916 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-dh4f8" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.168320 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.168565 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.169785 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551065-dh4f8"] Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.190053 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbj25\" (UniqueName: \"kubernetes.io/projected/4f2ff969-dbb6-4152-88e4-bab9ce6ff722-kube-api-access-nbj25\") pod \"collect-profiles-29551065-dh4f8\" (UID: \"4f2ff969-dbb6-4152-88e4-bab9ce6ff722\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-dh4f8" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.190500 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f2ff969-dbb6-4152-88e4-bab9ce6ff722-config-volume\") pod \"collect-profiles-29551065-dh4f8\" (UID: \"4f2ff969-dbb6-4152-88e4-bab9ce6ff722\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-dh4f8" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.190608 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f2ff969-dbb6-4152-88e4-bab9ce6ff722-secret-volume\") pod \"collect-profiles-29551065-dh4f8\" (UID: \"4f2ff969-dbb6-4152-88e4-bab9ce6ff722\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-dh4f8" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.291779 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f2ff969-dbb6-4152-88e4-bab9ce6ff722-secret-volume\") pod \"collect-profiles-29551065-dh4f8\" (UID: \"4f2ff969-dbb6-4152-88e4-bab9ce6ff722\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-dh4f8" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.291889 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbj25\" (UniqueName: \"kubernetes.io/projected/4f2ff969-dbb6-4152-88e4-bab9ce6ff722-kube-api-access-nbj25\") pod \"collect-profiles-29551065-dh4f8\" (UID: \"4f2ff969-dbb6-4152-88e4-bab9ce6ff722\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-dh4f8" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.291960 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f2ff969-dbb6-4152-88e4-bab9ce6ff722-config-volume\") pod \"collect-profiles-29551065-dh4f8\" (UID: \"4f2ff969-dbb6-4152-88e4-bab9ce6ff722\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-dh4f8" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.293246 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f2ff969-dbb6-4152-88e4-bab9ce6ff722-config-volume\") pod \"collect-profiles-29551065-dh4f8\" (UID: \"4f2ff969-dbb6-4152-88e4-bab9ce6ff722\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-dh4f8" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.298582 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/4f2ff969-dbb6-4152-88e4-bab9ce6ff722-secret-volume\") pod \"collect-profiles-29551065-dh4f8\" (UID: \"4f2ff969-dbb6-4152-88e4-bab9ce6ff722\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-dh4f8" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.311495 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbj25\" (UniqueName: \"kubernetes.io/projected/4f2ff969-dbb6-4152-88e4-bab9ce6ff722-kube-api-access-nbj25\") pod \"collect-profiles-29551065-dh4f8\" (UID: \"4f2ff969-dbb6-4152-88e4-bab9ce6ff722\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-dh4f8" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.490217 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-dh4f8" Mar 09 13:45:00 crc kubenswrapper[4703]: I0309 13:45:00.943258 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551065-dh4f8"] Mar 09 13:45:01 crc kubenswrapper[4703]: I0309 13:45:01.392329 4703 generic.go:334] "Generic (PLEG): container finished" podID="4f2ff969-dbb6-4152-88e4-bab9ce6ff722" containerID="50deabb6efce260a40b6bbbf4e79f5f68f0c9169e012c935055242589bb07df7" exitCode=0 Mar 09 13:45:01 crc kubenswrapper[4703]: I0309 13:45:01.392467 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-dh4f8" event={"ID":"4f2ff969-dbb6-4152-88e4-bab9ce6ff722","Type":"ContainerDied","Data":"50deabb6efce260a40b6bbbf4e79f5f68f0c9169e012c935055242589bb07df7"} Mar 09 13:45:01 crc kubenswrapper[4703]: I0309 13:45:01.392614 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-dh4f8" 
event={"ID":"4f2ff969-dbb6-4152-88e4-bab9ce6ff722","Type":"ContainerStarted","Data":"03441189972e2149405285ee4b61ea76d860d05629954cbb8f198d1aeee2708d"} Mar 09 13:45:02 crc kubenswrapper[4703]: I0309 13:45:02.796124 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-dh4f8" Mar 09 13:45:02 crc kubenswrapper[4703]: I0309 13:45:02.830353 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbj25\" (UniqueName: \"kubernetes.io/projected/4f2ff969-dbb6-4152-88e4-bab9ce6ff722-kube-api-access-nbj25\") pod \"4f2ff969-dbb6-4152-88e4-bab9ce6ff722\" (UID: \"4f2ff969-dbb6-4152-88e4-bab9ce6ff722\") " Mar 09 13:45:02 crc kubenswrapper[4703]: I0309 13:45:02.830470 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f2ff969-dbb6-4152-88e4-bab9ce6ff722-config-volume\") pod \"4f2ff969-dbb6-4152-88e4-bab9ce6ff722\" (UID: \"4f2ff969-dbb6-4152-88e4-bab9ce6ff722\") " Mar 09 13:45:02 crc kubenswrapper[4703]: I0309 13:45:02.830508 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f2ff969-dbb6-4152-88e4-bab9ce6ff722-secret-volume\") pod \"4f2ff969-dbb6-4152-88e4-bab9ce6ff722\" (UID: \"4f2ff969-dbb6-4152-88e4-bab9ce6ff722\") " Mar 09 13:45:02 crc kubenswrapper[4703]: I0309 13:45:02.831326 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f2ff969-dbb6-4152-88e4-bab9ce6ff722-config-volume" (OuterVolumeSpecName: "config-volume") pod "4f2ff969-dbb6-4152-88e4-bab9ce6ff722" (UID: "4f2ff969-dbb6-4152-88e4-bab9ce6ff722"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 13:45:02 crc kubenswrapper[4703]: I0309 13:45:02.835460 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2ff969-dbb6-4152-88e4-bab9ce6ff722-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4f2ff969-dbb6-4152-88e4-bab9ce6ff722" (UID: "4f2ff969-dbb6-4152-88e4-bab9ce6ff722"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:45:02 crc kubenswrapper[4703]: I0309 13:45:02.835777 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f2ff969-dbb6-4152-88e4-bab9ce6ff722-kube-api-access-nbj25" (OuterVolumeSpecName: "kube-api-access-nbj25") pod "4f2ff969-dbb6-4152-88e4-bab9ce6ff722" (UID: "4f2ff969-dbb6-4152-88e4-bab9ce6ff722"). InnerVolumeSpecName "kube-api-access-nbj25". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:45:02 crc kubenswrapper[4703]: I0309 13:45:02.931277 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbj25\" (UniqueName: \"kubernetes.io/projected/4f2ff969-dbb6-4152-88e4-bab9ce6ff722-kube-api-access-nbj25\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:02 crc kubenswrapper[4703]: I0309 13:45:02.931308 4703 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f2ff969-dbb6-4152-88e4-bab9ce6ff722-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:02 crc kubenswrapper[4703]: I0309 13:45:02.931317 4703 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f2ff969-dbb6-4152-88e4-bab9ce6ff722-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:03 crc kubenswrapper[4703]: I0309 13:45:03.411357 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-dh4f8" 
event={"ID":"4f2ff969-dbb6-4152-88e4-bab9ce6ff722","Type":"ContainerDied","Data":"03441189972e2149405285ee4b61ea76d860d05629954cbb8f198d1aeee2708d"} Mar 09 13:45:03 crc kubenswrapper[4703]: I0309 13:45:03.411723 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03441189972e2149405285ee4b61ea76d860d05629954cbb8f198d1aeee2708d" Mar 09 13:45:03 crc kubenswrapper[4703]: I0309 13:45:03.411451 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551065-dh4f8" Mar 09 13:45:09 crc kubenswrapper[4703]: I0309 13:45:09.500273 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:45:09 crc kubenswrapper[4703]: I0309 13:45:09.501044 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:45:15 crc kubenswrapper[4703]: I0309 13:45:15.511683 4703 generic.go:334] "Generic (PLEG): container finished" podID="98ec184b-eb8e-4967-b8ec-17cb6f984ccb" containerID="d7b39f5cef1fc23c2e91878c4692c968d24bdb7697cd879ccc1febd9f9cad2aa" exitCode=137 Mar 09 13:45:15 crc kubenswrapper[4703]: I0309 13:45:15.511786 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="manila-kuttl-tests/ceph" event={"ID":"98ec184b-eb8e-4967-b8ec-17cb6f984ccb","Type":"ContainerDied","Data":"d7b39f5cef1fc23c2e91878c4692c968d24bdb7697cd879ccc1febd9f9cad2aa"} Mar 09 13:45:15 crc kubenswrapper[4703]: I0309 13:45:15.512239 4703 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="manila-kuttl-tests/ceph" event={"ID":"98ec184b-eb8e-4967-b8ec-17cb6f984ccb","Type":"ContainerDied","Data":"db0f196c382ce3a2ed3620a900fb64c1d159aa6da429e53dc7c624ca017f33d6"} Mar 09 13:45:15 crc kubenswrapper[4703]: I0309 13:45:15.512258 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db0f196c382ce3a2ed3620a900fb64c1d159aa6da429e53dc7c624ca017f33d6" Mar 09 13:45:15 crc kubenswrapper[4703]: I0309 13:45:15.523022 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="manila-kuttl-tests/ceph" Mar 09 13:45:15 crc kubenswrapper[4703]: I0309 13:45:15.604694 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckxm7\" (UniqueName: \"kubernetes.io/projected/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-kube-api-access-ckxm7\") pod \"98ec184b-eb8e-4967-b8ec-17cb6f984ccb\" (UID: \"98ec184b-eb8e-4967-b8ec-17cb6f984ccb\") " Mar 09 13:45:15 crc kubenswrapper[4703]: I0309 13:45:15.604751 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-log\") pod \"98ec184b-eb8e-4967-b8ec-17cb6f984ccb\" (UID: \"98ec184b-eb8e-4967-b8ec-17cb6f984ccb\") " Mar 09 13:45:15 crc kubenswrapper[4703]: I0309 13:45:15.604775 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-data\") pod \"98ec184b-eb8e-4967-b8ec-17cb6f984ccb\" (UID: \"98ec184b-eb8e-4967-b8ec-17cb6f984ccb\") " Mar 09 13:45:15 crc kubenswrapper[4703]: I0309 13:45:15.604806 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-run\") pod \"98ec184b-eb8e-4967-b8ec-17cb6f984ccb\" (UID: \"98ec184b-eb8e-4967-b8ec-17cb6f984ccb\") " Mar 09 13:45:15 crc kubenswrapper[4703]: I0309 
13:45:15.605541 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-log" (OuterVolumeSpecName: "log") pod "98ec184b-eb8e-4967-b8ec-17cb6f984ccb" (UID: "98ec184b-eb8e-4967-b8ec-17cb6f984ccb"). InnerVolumeSpecName "log". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:45:15 crc kubenswrapper[4703]: I0309 13:45:15.605572 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-run" (OuterVolumeSpecName: "run") pod "98ec184b-eb8e-4967-b8ec-17cb6f984ccb" (UID: "98ec184b-eb8e-4967-b8ec-17cb6f984ccb"). InnerVolumeSpecName "run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:45:15 crc kubenswrapper[4703]: I0309 13:45:15.611183 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-kube-api-access-ckxm7" (OuterVolumeSpecName: "kube-api-access-ckxm7") pod "98ec184b-eb8e-4967-b8ec-17cb6f984ccb" (UID: "98ec184b-eb8e-4967-b8ec-17cb6f984ccb"). InnerVolumeSpecName "kube-api-access-ckxm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:45:15 crc kubenswrapper[4703]: I0309 13:45:15.611779 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-data" (OuterVolumeSpecName: "data") pod "98ec184b-eb8e-4967-b8ec-17cb6f984ccb" (UID: "98ec184b-eb8e-4967-b8ec-17cb6f984ccb"). InnerVolumeSpecName "data". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:45:15 crc kubenswrapper[4703]: I0309 13:45:15.705774 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckxm7\" (UniqueName: \"kubernetes.io/projected/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-kube-api-access-ckxm7\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:15 crc kubenswrapper[4703]: I0309 13:45:15.705830 4703 reconciler_common.go:293] "Volume detached for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-log\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:15 crc kubenswrapper[4703]: I0309 13:45:15.705869 4703 reconciler_common.go:293] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-data\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:15 crc kubenswrapper[4703]: I0309 13:45:15.705884 4703 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/98ec184b-eb8e-4967-b8ec-17cb6f984ccb-run\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:16 crc kubenswrapper[4703]: I0309 13:45:16.520793 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="manila-kuttl-tests/ceph" Mar 09 13:45:16 crc kubenswrapper[4703]: I0309 13:45:16.575672 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["manila-kuttl-tests/ceph"] Mar 09 13:45:16 crc kubenswrapper[4703]: I0309 13:45:16.584758 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["manila-kuttl-tests/ceph"] Mar 09 13:45:16 crc kubenswrapper[4703]: I0309 13:45:16.714664 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98ec184b-eb8e-4967-b8ec-17cb6f984ccb" path="/var/lib/kubelet/pods/98ec184b-eb8e-4967-b8ec-17cb6f984ccb/volumes" Mar 09 13:45:18 crc kubenswrapper[4703]: I0309 13:45:18.584440 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2"] Mar 09 13:45:18 crc kubenswrapper[4703]: I0309 13:45:18.585074 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2" podUID="ac75edd0-0cc7-42c5-b1c1-6afe38f7adef" containerName="manager" containerID="cri-o://a99f314dc223c70d8cd12e4db4fec355b6168d0636ab6a9cdbda14738c815083" gracePeriod=10 Mar 09 13:45:18 crc kubenswrapper[4703]: I0309 13:45:18.812565 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-jmzc6"] Mar 09 13:45:18 crc kubenswrapper[4703]: I0309 13:45:18.812783 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-jmzc6" podUID="895726c9-a46b-4503-b195-ef668833c34f" containerName="registry-server" containerID="cri-o://c4a80f077dc8f6b6f4948a3dbb1162abecd394310108e7a3c7bd239aca6bd5ab" gracePeriod=30 Mar 09 13:45:18 crc kubenswrapper[4703]: I0309 13:45:18.857214 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg"] Mar 09 13:45:18 crc kubenswrapper[4703]: I0309 
13:45:18.880679 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40r2vpg"] Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.035614 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2" Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.152232 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ac75edd0-0cc7-42c5-b1c1-6afe38f7adef-webhook-cert\") pod \"ac75edd0-0cc7-42c5-b1c1-6afe38f7adef\" (UID: \"ac75edd0-0cc7-42c5-b1c1-6afe38f7adef\") " Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.152303 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rtwn\" (UniqueName: \"kubernetes.io/projected/ac75edd0-0cc7-42c5-b1c1-6afe38f7adef-kube-api-access-4rtwn\") pod \"ac75edd0-0cc7-42c5-b1c1-6afe38f7adef\" (UID: \"ac75edd0-0cc7-42c5-b1c1-6afe38f7adef\") " Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.152336 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ac75edd0-0cc7-42c5-b1c1-6afe38f7adef-apiservice-cert\") pod \"ac75edd0-0cc7-42c5-b1c1-6afe38f7adef\" (UID: \"ac75edd0-0cc7-42c5-b1c1-6afe38f7adef\") " Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.176996 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac75edd0-0cc7-42c5-b1c1-6afe38f7adef-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "ac75edd0-0cc7-42c5-b1c1-6afe38f7adef" (UID: "ac75edd0-0cc7-42c5-b1c1-6afe38f7adef"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.181354 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac75edd0-0cc7-42c5-b1c1-6afe38f7adef-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "ac75edd0-0cc7-42c5-b1c1-6afe38f7adef" (UID: "ac75edd0-0cc7-42c5-b1c1-6afe38f7adef"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.187024 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac75edd0-0cc7-42c5-b1c1-6afe38f7adef-kube-api-access-4rtwn" (OuterVolumeSpecName: "kube-api-access-4rtwn") pod "ac75edd0-0cc7-42c5-b1c1-6afe38f7adef" (UID: "ac75edd0-0cc7-42c5-b1c1-6afe38f7adef"). InnerVolumeSpecName "kube-api-access-4rtwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.219159 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-jmzc6" Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.254564 4703 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ac75edd0-0cc7-42c5-b1c1-6afe38f7adef-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.254812 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rtwn\" (UniqueName: \"kubernetes.io/projected/ac75edd0-0cc7-42c5-b1c1-6afe38f7adef-kube-api-access-4rtwn\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.254943 4703 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ac75edd0-0cc7-42c5-b1c1-6afe38f7adef-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.355884 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpghl\" (UniqueName: \"kubernetes.io/projected/895726c9-a46b-4503-b195-ef668833c34f-kube-api-access-kpghl\") pod \"895726c9-a46b-4503-b195-ef668833c34f\" (UID: \"895726c9-a46b-4503-b195-ef668833c34f\") " Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.358990 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/895726c9-a46b-4503-b195-ef668833c34f-kube-api-access-kpghl" (OuterVolumeSpecName: "kube-api-access-kpghl") pod "895726c9-a46b-4503-b195-ef668833c34f" (UID: "895726c9-a46b-4503-b195-ef668833c34f"). InnerVolumeSpecName "kube-api-access-kpghl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.457210 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpghl\" (UniqueName: \"kubernetes.io/projected/895726c9-a46b-4503-b195-ef668833c34f-kube-api-access-kpghl\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.550576 4703 generic.go:334] "Generic (PLEG): container finished" podID="ac75edd0-0cc7-42c5-b1c1-6afe38f7adef" containerID="a99f314dc223c70d8cd12e4db4fec355b6168d0636ab6a9cdbda14738c815083" exitCode=0 Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.550708 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2" Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.550741 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2" event={"ID":"ac75edd0-0cc7-42c5-b1c1-6afe38f7adef","Type":"ContainerDied","Data":"a99f314dc223c70d8cd12e4db4fec355b6168d0636ab6a9cdbda14738c815083"} Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.550801 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2" event={"ID":"ac75edd0-0cc7-42c5-b1c1-6afe38f7adef","Type":"ContainerDied","Data":"25c28d3a36d9c085afa52b3d6c4e103eef8354f10b2206d36d36715d1d8a2280"} Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.550830 4703 scope.go:117] "RemoveContainer" containerID="a99f314dc223c70d8cd12e4db4fec355b6168d0636ab6a9cdbda14738c815083" Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.553374 4703 generic.go:334] "Generic (PLEG): container finished" podID="895726c9-a46b-4503-b195-ef668833c34f" containerID="c4a80f077dc8f6b6f4948a3dbb1162abecd394310108e7a3c7bd239aca6bd5ab" exitCode=0 Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 
13:45:19.553444 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-jmzc6" Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.553468 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-jmzc6" event={"ID":"895726c9-a46b-4503-b195-ef668833c34f","Type":"ContainerDied","Data":"c4a80f077dc8f6b6f4948a3dbb1162abecd394310108e7a3c7bd239aca6bd5ab"} Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.553794 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-jmzc6" event={"ID":"895726c9-a46b-4503-b195-ef668833c34f","Type":"ContainerDied","Data":"c0f3371a30fb2f001ec7194288b1b39d706774be10878b430bc24110c0034373"} Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.581128 4703 scope.go:117] "RemoveContainer" containerID="a99f314dc223c70d8cd12e4db4fec355b6168d0636ab6a9cdbda14738c815083" Mar 09 13:45:19 crc kubenswrapper[4703]: E0309 13:45:19.581659 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a99f314dc223c70d8cd12e4db4fec355b6168d0636ab6a9cdbda14738c815083\": container with ID starting with a99f314dc223c70d8cd12e4db4fec355b6168d0636ab6a9cdbda14738c815083 not found: ID does not exist" containerID="a99f314dc223c70d8cd12e4db4fec355b6168d0636ab6a9cdbda14738c815083" Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.581706 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a99f314dc223c70d8cd12e4db4fec355b6168d0636ab6a9cdbda14738c815083"} err="failed to get container status \"a99f314dc223c70d8cd12e4db4fec355b6168d0636ab6a9cdbda14738c815083\": rpc error: code = NotFound desc = could not find container \"a99f314dc223c70d8cd12e4db4fec355b6168d0636ab6a9cdbda14738c815083\": container with ID starting with 
a99f314dc223c70d8cd12e4db4fec355b6168d0636ab6a9cdbda14738c815083 not found: ID does not exist" Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.581738 4703 scope.go:117] "RemoveContainer" containerID="c4a80f077dc8f6b6f4948a3dbb1162abecd394310108e7a3c7bd239aca6bd5ab" Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.598978 4703 scope.go:117] "RemoveContainer" containerID="c4a80f077dc8f6b6f4948a3dbb1162abecd394310108e7a3c7bd239aca6bd5ab" Mar 09 13:45:19 crc kubenswrapper[4703]: E0309 13:45:19.599462 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4a80f077dc8f6b6f4948a3dbb1162abecd394310108e7a3c7bd239aca6bd5ab\": container with ID starting with c4a80f077dc8f6b6f4948a3dbb1162abecd394310108e7a3c7bd239aca6bd5ab not found: ID does not exist" containerID="c4a80f077dc8f6b6f4948a3dbb1162abecd394310108e7a3c7bd239aca6bd5ab" Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.599492 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4a80f077dc8f6b6f4948a3dbb1162abecd394310108e7a3c7bd239aca6bd5ab"} err="failed to get container status \"c4a80f077dc8f6b6f4948a3dbb1162abecd394310108e7a3c7bd239aca6bd5ab\": rpc error: code = NotFound desc = could not find container \"c4a80f077dc8f6b6f4948a3dbb1162abecd394310108e7a3c7bd239aca6bd5ab\": container with ID starting with c4a80f077dc8f6b6f4948a3dbb1162abecd394310108e7a3c7bd239aca6bd5ab not found: ID does not exist" Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.613438 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2"] Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.621341 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-85f6c9db84-6shz2"] Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.627887 4703 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-jmzc6"] Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.633887 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-jmzc6"] Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.908129 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-wv5q9"] Mar 09 13:45:19 crc kubenswrapper[4703]: I0309 13:45:19.908676 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-wv5q9" podUID="f7c690e1-16e7-4418-b82c-a846f6de3430" containerName="operator" containerID="cri-o://2c4b7c7293332bae03aadf0e8b1aa3b65ac74e5cee2c3ef14a86cec077857ca2" gracePeriod=10 Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.203076 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-lw4f8"] Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.203341 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-lw4f8" podUID="54568517-643e-4d68-bb0b-daf6562bb23d" containerName="registry-server" containerID="cri-o://78a29c1e64ac21ec83effb16efbfe855c543d7f39431248b1d815c5f3837af74" gracePeriod=30 Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.228118 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk"] Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.235207 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908dngk"] Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.303547 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-wv5q9" Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.373014 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pltbj\" (UniqueName: \"kubernetes.io/projected/f7c690e1-16e7-4418-b82c-a846f6de3430-kube-api-access-pltbj\") pod \"f7c690e1-16e7-4418-b82c-a846f6de3430\" (UID: \"f7c690e1-16e7-4418-b82c-a846f6de3430\") " Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.377829 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7c690e1-16e7-4418-b82c-a846f6de3430-kube-api-access-pltbj" (OuterVolumeSpecName: "kube-api-access-pltbj") pod "f7c690e1-16e7-4418-b82c-a846f6de3430" (UID: "f7c690e1-16e7-4418-b82c-a846f6de3430"). InnerVolumeSpecName "kube-api-access-pltbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.474250 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pltbj\" (UniqueName: \"kubernetes.io/projected/f7c690e1-16e7-4418-b82c-a846f6de3430-kube-api-access-pltbj\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.538108 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-lw4f8" Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.560535 4703 generic.go:334] "Generic (PLEG): container finished" podID="54568517-643e-4d68-bb0b-daf6562bb23d" containerID="78a29c1e64ac21ec83effb16efbfe855c543d7f39431248b1d815c5f3837af74" exitCode=0 Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.560573 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-lw4f8" event={"ID":"54568517-643e-4d68-bb0b-daf6562bb23d","Type":"ContainerDied","Data":"78a29c1e64ac21ec83effb16efbfe855c543d7f39431248b1d815c5f3837af74"} Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.560611 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-lw4f8" event={"ID":"54568517-643e-4d68-bb0b-daf6562bb23d","Type":"ContainerDied","Data":"3d36ca4826ec0c80883da1069d981a1c8ce207d31bf56778673643c0c3063c3f"} Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.560610 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-lw4f8" Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.560631 4703 scope.go:117] "RemoveContainer" containerID="78a29c1e64ac21ec83effb16efbfe855c543d7f39431248b1d815c5f3837af74" Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.562054 4703 generic.go:334] "Generic (PLEG): container finished" podID="f7c690e1-16e7-4418-b82c-a846f6de3430" containerID="2c4b7c7293332bae03aadf0e8b1aa3b65ac74e5cee2c3ef14a86cec077857ca2" exitCode=0 Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.562103 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-wv5q9" event={"ID":"f7c690e1-16e7-4418-b82c-a846f6de3430","Type":"ContainerDied","Data":"2c4b7c7293332bae03aadf0e8b1aa3b65ac74e5cee2c3ef14a86cec077857ca2"} Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.562115 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-wv5q9" Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.562119 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-wv5q9" event={"ID":"f7c690e1-16e7-4418-b82c-a846f6de3430","Type":"ContainerDied","Data":"a736d62c855a173ab0b0b51082f8a8f4b0812170b8a7acfdb7c390d36e998615"} Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.583212 4703 scope.go:117] "RemoveContainer" containerID="78a29c1e64ac21ec83effb16efbfe855c543d7f39431248b1d815c5f3837af74" Mar 09 13:45:20 crc kubenswrapper[4703]: E0309 13:45:20.588229 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78a29c1e64ac21ec83effb16efbfe855c543d7f39431248b1d815c5f3837af74\": container with ID starting with 78a29c1e64ac21ec83effb16efbfe855c543d7f39431248b1d815c5f3837af74 not found: ID does not exist" 
containerID="78a29c1e64ac21ec83effb16efbfe855c543d7f39431248b1d815c5f3837af74" Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.588380 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78a29c1e64ac21ec83effb16efbfe855c543d7f39431248b1d815c5f3837af74"} err="failed to get container status \"78a29c1e64ac21ec83effb16efbfe855c543d7f39431248b1d815c5f3837af74\": rpc error: code = NotFound desc = could not find container \"78a29c1e64ac21ec83effb16efbfe855c543d7f39431248b1d815c5f3837af74\": container with ID starting with 78a29c1e64ac21ec83effb16efbfe855c543d7f39431248b1d815c5f3837af74 not found: ID does not exist" Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.590199 4703 scope.go:117] "RemoveContainer" containerID="2c4b7c7293332bae03aadf0e8b1aa3b65ac74e5cee2c3ef14a86cec077857ca2" Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.600653 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-wv5q9"] Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.604099 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-wv5q9"] Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.608503 4703 scope.go:117] "RemoveContainer" containerID="2c4b7c7293332bae03aadf0e8b1aa3b65ac74e5cee2c3ef14a86cec077857ca2" Mar 09 13:45:20 crc kubenswrapper[4703]: E0309 13:45:20.608925 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c4b7c7293332bae03aadf0e8b1aa3b65ac74e5cee2c3ef14a86cec077857ca2\": container with ID starting with 2c4b7c7293332bae03aadf0e8b1aa3b65ac74e5cee2c3ef14a86cec077857ca2 not found: ID does not exist" containerID="2c4b7c7293332bae03aadf0e8b1aa3b65ac74e5cee2c3ef14a86cec077857ca2" Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.608958 4703 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"2c4b7c7293332bae03aadf0e8b1aa3b65ac74e5cee2c3ef14a86cec077857ca2"} err="failed to get container status \"2c4b7c7293332bae03aadf0e8b1aa3b65ac74e5cee2c3ef14a86cec077857ca2\": rpc error: code = NotFound desc = could not find container \"2c4b7c7293332bae03aadf0e8b1aa3b65ac74e5cee2c3ef14a86cec077857ca2\": container with ID starting with 2c4b7c7293332bae03aadf0e8b1aa3b65ac74e5cee2c3ef14a86cec077857ca2 not found: ID does not exist" Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.675564 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm4xk\" (UniqueName: \"kubernetes.io/projected/54568517-643e-4d68-bb0b-daf6562bb23d-kube-api-access-vm4xk\") pod \"54568517-643e-4d68-bb0b-daf6562bb23d\" (UID: \"54568517-643e-4d68-bb0b-daf6562bb23d\") " Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.679013 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54568517-643e-4d68-bb0b-daf6562bb23d-kube-api-access-vm4xk" (OuterVolumeSpecName: "kube-api-access-vm4xk") pod "54568517-643e-4d68-bb0b-daf6562bb23d" (UID: "54568517-643e-4d68-bb0b-daf6562bb23d"). InnerVolumeSpecName "kube-api-access-vm4xk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.716458 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27161253-4d46-49d6-a665-87451a02c056" path="/var/lib/kubelet/pods/27161253-4d46-49d6-a665-87451a02c056/volumes" Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.717169 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="895726c9-a46b-4503-b195-ef668833c34f" path="/var/lib/kubelet/pods/895726c9-a46b-4503-b195-ef668833c34f/volumes" Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.717656 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac75edd0-0cc7-42c5-b1c1-6afe38f7adef" path="/var/lib/kubelet/pods/ac75edd0-0cc7-42c5-b1c1-6afe38f7adef/volumes" Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.718538 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb5a75be-0890-4f1c-988b-ac1f0d2399b3" path="/var/lib/kubelet/pods/eb5a75be-0890-4f1c-988b-ac1f0d2399b3/volumes" Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.719075 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7c690e1-16e7-4418-b82c-a846f6de3430" path="/var/lib/kubelet/pods/f7c690e1-16e7-4418-b82c-a846f6de3430/volumes" Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.777575 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm4xk\" (UniqueName: \"kubernetes.io/projected/54568517-643e-4d68-bb0b-daf6562bb23d-kube-api-access-vm4xk\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.894561 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-lw4f8"] Mar 09 13:45:20 crc kubenswrapper[4703]: I0309 13:45:20.902026 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-lw4f8"] Mar 09 13:45:22 crc kubenswrapper[4703]: 
I0309 13:45:22.720397 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54568517-643e-4d68-bb0b-daf6562bb23d" path="/var/lib/kubelet/pods/54568517-643e-4d68-bb0b-daf6562bb23d/volumes" Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.179656 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq"] Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.181410 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq" podUID="b3ec581a-8cf6-40d6-abfa-347b39c624c2" containerName="manager" containerID="cri-o://c31a847567ac3c4ad6903e46355455d2559ea249ce7ba506a5e804807c2bd262" gracePeriod=10 Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.425746 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-tzq9d"] Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.425989 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-tzq9d" podUID="aadbd6bd-930e-473a-851e-820fbfcd22db" containerName="registry-server" containerID="cri-o://3ede56d7dcf6cefa860b43906029fcb5c3f1f6cb99775abcedf9d911b05fe2d0" gracePeriod=30 Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.465531 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p"] Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.477217 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cw2x2p"] Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.582879 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq" Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.624170 4703 generic.go:334] "Generic (PLEG): container finished" podID="aadbd6bd-930e-473a-851e-820fbfcd22db" containerID="3ede56d7dcf6cefa860b43906029fcb5c3f1f6cb99775abcedf9d911b05fe2d0" exitCode=0 Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.624218 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-tzq9d" event={"ID":"aadbd6bd-930e-473a-851e-820fbfcd22db","Type":"ContainerDied","Data":"3ede56d7dcf6cefa860b43906029fcb5c3f1f6cb99775abcedf9d911b05fe2d0"} Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.625210 4703 generic.go:334] "Generic (PLEG): container finished" podID="b3ec581a-8cf6-40d6-abfa-347b39c624c2" containerID="c31a847567ac3c4ad6903e46355455d2559ea249ce7ba506a5e804807c2bd262" exitCode=0 Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.625231 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq" event={"ID":"b3ec581a-8cf6-40d6-abfa-347b39c624c2","Type":"ContainerDied","Data":"c31a847567ac3c4ad6903e46355455d2559ea249ce7ba506a5e804807c2bd262"} Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.625244 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq" event={"ID":"b3ec581a-8cf6-40d6-abfa-347b39c624c2","Type":"ContainerDied","Data":"952c413be6840ef2ae2f32812f4911f000526aa60facd8a7b8234637e0f2cfe4"} Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.625258 4703 scope.go:117] "RemoveContainer" containerID="c31a847567ac3c4ad6903e46355455d2559ea249ce7ba506a5e804807c2bd262" Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.625358 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq" Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.672032 4703 scope.go:117] "RemoveContainer" containerID="c31a847567ac3c4ad6903e46355455d2559ea249ce7ba506a5e804807c2bd262" Mar 09 13:45:26 crc kubenswrapper[4703]: E0309 13:45:26.672437 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c31a847567ac3c4ad6903e46355455d2559ea249ce7ba506a5e804807c2bd262\": container with ID starting with c31a847567ac3c4ad6903e46355455d2559ea249ce7ba506a5e804807c2bd262 not found: ID does not exist" containerID="c31a847567ac3c4ad6903e46355455d2559ea249ce7ba506a5e804807c2bd262" Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.672465 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c31a847567ac3c4ad6903e46355455d2559ea249ce7ba506a5e804807c2bd262"} err="failed to get container status \"c31a847567ac3c4ad6903e46355455d2559ea249ce7ba506a5e804807c2bd262\": rpc error: code = NotFound desc = could not find container \"c31a847567ac3c4ad6903e46355455d2559ea249ce7ba506a5e804807c2bd262\": container with ID starting with c31a847567ac3c4ad6903e46355455d2559ea249ce7ba506a5e804807c2bd262 not found: ID does not exist" Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.713236 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="512fe7ca-3877-4ac3-bb40-870e62a89a04" path="/var/lib/kubelet/pods/512fe7ca-3877-4ac3-bb40-870e62a89a04/volumes" Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.764236 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b3ec581a-8cf6-40d6-abfa-347b39c624c2-webhook-cert\") pod \"b3ec581a-8cf6-40d6-abfa-347b39c624c2\" (UID: \"b3ec581a-8cf6-40d6-abfa-347b39c624c2\") " Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.764635 
4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b3ec581a-8cf6-40d6-abfa-347b39c624c2-apiservice-cert\") pod \"b3ec581a-8cf6-40d6-abfa-347b39c624c2\" (UID: \"b3ec581a-8cf6-40d6-abfa-347b39c624c2\") " Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.764971 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6lbf\" (UniqueName: \"kubernetes.io/projected/b3ec581a-8cf6-40d6-abfa-347b39c624c2-kube-api-access-p6lbf\") pod \"b3ec581a-8cf6-40d6-abfa-347b39c624c2\" (UID: \"b3ec581a-8cf6-40d6-abfa-347b39c624c2\") " Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.769059 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3ec581a-8cf6-40d6-abfa-347b39c624c2-kube-api-access-p6lbf" (OuterVolumeSpecName: "kube-api-access-p6lbf") pod "b3ec581a-8cf6-40d6-abfa-347b39c624c2" (UID: "b3ec581a-8cf6-40d6-abfa-347b39c624c2"). InnerVolumeSpecName "kube-api-access-p6lbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.775987 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ec581a-8cf6-40d6-abfa-347b39c624c2-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "b3ec581a-8cf6-40d6-abfa-347b39c624c2" (UID: "b3ec581a-8cf6-40d6-abfa-347b39c624c2"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.786050 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ec581a-8cf6-40d6-abfa-347b39c624c2-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "b3ec581a-8cf6-40d6-abfa-347b39c624c2" (UID: "b3ec581a-8cf6-40d6-abfa-347b39c624c2"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.827535 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-tzq9d" Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.865998 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6lbf\" (UniqueName: \"kubernetes.io/projected/b3ec581a-8cf6-40d6-abfa-347b39c624c2-kube-api-access-p6lbf\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.866065 4703 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b3ec581a-8cf6-40d6-abfa-347b39c624c2-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.866077 4703 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b3ec581a-8cf6-40d6-abfa-347b39c624c2-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.959051 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq"] Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.963332 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-68c564b879-m5wgq"] Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.967606 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfcrc\" (UniqueName: \"kubernetes.io/projected/aadbd6bd-930e-473a-851e-820fbfcd22db-kube-api-access-xfcrc\") pod \"aadbd6bd-930e-473a-851e-820fbfcd22db\" (UID: \"aadbd6bd-930e-473a-851e-820fbfcd22db\") " Mar 09 13:45:26 crc kubenswrapper[4703]: I0309 13:45:26.970272 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/aadbd6bd-930e-473a-851e-820fbfcd22db-kube-api-access-xfcrc" (OuterVolumeSpecName: "kube-api-access-xfcrc") pod "aadbd6bd-930e-473a-851e-820fbfcd22db" (UID: "aadbd6bd-930e-473a-851e-820fbfcd22db"). InnerVolumeSpecName "kube-api-access-xfcrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.069104 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfcrc\" (UniqueName: \"kubernetes.io/projected/aadbd6bd-930e-473a-851e-820fbfcd22db-kube-api-access-xfcrc\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.204528 4703 scope.go:117] "RemoveContainer" containerID="d7b39f5cef1fc23c2e91878c4692c968d24bdb7697cd879ccc1febd9f9cad2aa" Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.227668 4703 scope.go:117] "RemoveContainer" containerID="019eaed7210329cbeff0c2e64196ddbc02792a17db10f821e0b1e2fb95bcd462" Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.247347 4703 scope.go:117] "RemoveContainer" containerID="3e4cf567a5837002d2c0141ba0d9042475b10e8811d43c9947560db2df1b5935" Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.266424 4703 scope.go:117] "RemoveContainer" containerID="aa10110b3158b4476c3eae66a3096c4fbd3eeff75ac04dbf4d268f617988039c" Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.305654 4703 scope.go:117] "RemoveContainer" containerID="f2c94226eaaa88ffc320263cbb827a94e454d6ae3a3674526fe9a6165cc33c4b" Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.345818 4703 scope.go:117] "RemoveContainer" containerID="e385bf13f192160720fbf313777e508e74c6f8f981a405b01cd620e2c5854113" Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.373967 4703 scope.go:117] "RemoveContainer" containerID="3ede56d7dcf6cefa860b43906029fcb5c3f1f6cb99775abcedf9d911b05fe2d0" Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.400580 4703 scope.go:117] "RemoveContainer" 
containerID="accc86158e1720e2f0e11414c1e0992a7c4e7b8a90e35f318da65c64cb8ae708" Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.420742 4703 scope.go:117] "RemoveContainer" containerID="dd8b7bfcfb6ade8ba3bac6acd1811eafabf777b2ddb95e45644c6329c6651a5f" Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.443953 4703 scope.go:117] "RemoveContainer" containerID="ca476d16536191b77b407c1593f9e7ea379047a19311312edce7d6f3805e8311" Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.485937 4703 scope.go:117] "RemoveContainer" containerID="25a3def67eb3dcf50b633a42ddc5bb3f82877c01845ab257ad0e0ef4d1188df7" Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.515726 4703 scope.go:117] "RemoveContainer" containerID="0774a838ced41224ad9f3a8b9ace73875f3f26aa3d54b268fb8054f0d2176f97" Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.540118 4703 scope.go:117] "RemoveContainer" containerID="85faee2f4db6f602d51c14b0f8c52833092bd9ea8e7598c2f93a7c4e35b72d11" Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.570361 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5"] Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.570594 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5" podUID="5be038e1-1768-4ebe-9ec8-d86cb2173e75" containerName="manager" containerID="cri-o://3a0736a0f35a2b956da2c9dcb4b2acef881ae202604567408b11f9beb84019c4" gracePeriod=10 Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.570722 4703 scope.go:117] "RemoveContainer" containerID="babfd5640e89611c3f0808f3bdc73a9f43dcff78e059456e126a35336c10a9b4" Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.591828 4703 scope.go:117] "RemoveContainer" containerID="4b96991b9e8c867a4213db405e1e2d71eb4a3587e1b31a33fbe86bd6fd21595d" Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.621897 4703 scope.go:117] 
"RemoveContainer" containerID="8fab6a2da38a694c4e883d9ae79143e9450498160afbad5228424f3c025fbd8c" Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.642993 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-tzq9d" event={"ID":"aadbd6bd-930e-473a-851e-820fbfcd22db","Type":"ContainerDied","Data":"bb3c73b04e101c39ae86290cd4a10eb8598f1dcfec26fd98914bdfafb46b5c62"} Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.643069 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-tzq9d" Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.669298 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-tzq9d"] Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.673995 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-tzq9d"] Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.853811 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-4m97d"] Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.854090 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-4m97d" podUID="501adaf2-d65b-4e4f-97f5-aefac5df2d6f" containerName="registry-server" containerID="cri-o://2873c0f36a60a00efbd2fafdc6d961f427ff278d1cd38fb45f4cd7072ff4f812" gracePeriod=30 Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.887209 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6"] Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.894113 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/b8fc154d892d0d2aa94a34c8600d1a0cab4cdca8abc09f645bfd1f1da6dkxf6"] Mar 09 13:45:27 crc kubenswrapper[4703]: I0309 13:45:27.975596 4703 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.082524 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5be038e1-1768-4ebe-9ec8-d86cb2173e75-apiservice-cert\") pod \"5be038e1-1768-4ebe-9ec8-d86cb2173e75\" (UID: \"5be038e1-1768-4ebe-9ec8-d86cb2173e75\") " Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.082669 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbnbl\" (UniqueName: \"kubernetes.io/projected/5be038e1-1768-4ebe-9ec8-d86cb2173e75-kube-api-access-pbnbl\") pod \"5be038e1-1768-4ebe-9ec8-d86cb2173e75\" (UID: \"5be038e1-1768-4ebe-9ec8-d86cb2173e75\") " Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.082700 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5be038e1-1768-4ebe-9ec8-d86cb2173e75-webhook-cert\") pod \"5be038e1-1768-4ebe-9ec8-d86cb2173e75\" (UID: \"5be038e1-1768-4ebe-9ec8-d86cb2173e75\") " Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.088587 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5be038e1-1768-4ebe-9ec8-d86cb2173e75-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "5be038e1-1768-4ebe-9ec8-d86cb2173e75" (UID: "5be038e1-1768-4ebe-9ec8-d86cb2173e75"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.088752 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5be038e1-1768-4ebe-9ec8-d86cb2173e75-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "5be038e1-1768-4ebe-9ec8-d86cb2173e75" (UID: "5be038e1-1768-4ebe-9ec8-d86cb2173e75"). 
InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.090970 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5be038e1-1768-4ebe-9ec8-d86cb2173e75-kube-api-access-pbnbl" (OuterVolumeSpecName: "kube-api-access-pbnbl") pod "5be038e1-1768-4ebe-9ec8-d86cb2173e75" (UID: "5be038e1-1768-4ebe-9ec8-d86cb2173e75"). InnerVolumeSpecName "kube-api-access-pbnbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.184511 4703 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5be038e1-1768-4ebe-9ec8-d86cb2173e75-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.184554 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbnbl\" (UniqueName: \"kubernetes.io/projected/5be038e1-1768-4ebe-9ec8-d86cb2173e75-kube-api-access-pbnbl\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.184570 4703 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5be038e1-1768-4ebe-9ec8-d86cb2173e75-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.211926 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-4m97d" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.385797 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7gk8\" (UniqueName: \"kubernetes.io/projected/501adaf2-d65b-4e4f-97f5-aefac5df2d6f-kube-api-access-c7gk8\") pod \"501adaf2-d65b-4e4f-97f5-aefac5df2d6f\" (UID: \"501adaf2-d65b-4e4f-97f5-aefac5df2d6f\") " Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.388576 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/501adaf2-d65b-4e4f-97f5-aefac5df2d6f-kube-api-access-c7gk8" (OuterVolumeSpecName: "kube-api-access-c7gk8") pod "501adaf2-d65b-4e4f-97f5-aefac5df2d6f" (UID: "501adaf2-d65b-4e4f-97f5-aefac5df2d6f"). InnerVolumeSpecName "kube-api-access-c7gk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.487932 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7gk8\" (UniqueName: \"kubernetes.io/projected/501adaf2-d65b-4e4f-97f5-aefac5df2d6f-kube-api-access-c7gk8\") on node \"crc\" DevicePath \"\"" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.656503 4703 generic.go:334] "Generic (PLEG): container finished" podID="5be038e1-1768-4ebe-9ec8-d86cb2173e75" containerID="3a0736a0f35a2b956da2c9dcb4b2acef881ae202604567408b11f9beb84019c4" exitCode=0 Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.656604 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.656634 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5" event={"ID":"5be038e1-1768-4ebe-9ec8-d86cb2173e75","Type":"ContainerDied","Data":"3a0736a0f35a2b956da2c9dcb4b2acef881ae202604567408b11f9beb84019c4"} Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.656724 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5" event={"ID":"5be038e1-1768-4ebe-9ec8-d86cb2173e75","Type":"ContainerDied","Data":"e8ec7f71044982404f741e6ab6bddceba3fb2adb195983ac1c73342fce126da8"} Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.656802 4703 scope.go:117] "RemoveContainer" containerID="3a0736a0f35a2b956da2c9dcb4b2acef881ae202604567408b11f9beb84019c4" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.660435 4703 generic.go:334] "Generic (PLEG): container finished" podID="501adaf2-d65b-4e4f-97f5-aefac5df2d6f" containerID="2873c0f36a60a00efbd2fafdc6d961f427ff278d1cd38fb45f4cd7072ff4f812" exitCode=0 Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.660481 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-4m97d" event={"ID":"501adaf2-d65b-4e4f-97f5-aefac5df2d6f","Type":"ContainerDied","Data":"2873c0f36a60a00efbd2fafdc6d961f427ff278d1cd38fb45f4cd7072ff4f812"} Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.660511 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-4m97d" event={"ID":"501adaf2-d65b-4e4f-97f5-aefac5df2d6f","Type":"ContainerDied","Data":"e0227cc3d9a0cdd126e3c8f281a92e8a3f2bcc39e02d920db8ac5d90d026ece5"} Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.660584 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-4m97d" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.675429 4703 scope.go:117] "RemoveContainer" containerID="3a0736a0f35a2b956da2c9dcb4b2acef881ae202604567408b11f9beb84019c4" Mar 09 13:45:28 crc kubenswrapper[4703]: E0309 13:45:28.676124 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a0736a0f35a2b956da2c9dcb4b2acef881ae202604567408b11f9beb84019c4\": container with ID starting with 3a0736a0f35a2b956da2c9dcb4b2acef881ae202604567408b11f9beb84019c4 not found: ID does not exist" containerID="3a0736a0f35a2b956da2c9dcb4b2acef881ae202604567408b11f9beb84019c4" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.676189 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a0736a0f35a2b956da2c9dcb4b2acef881ae202604567408b11f9beb84019c4"} err="failed to get container status \"3a0736a0f35a2b956da2c9dcb4b2acef881ae202604567408b11f9beb84019c4\": rpc error: code = NotFound desc = could not find container \"3a0736a0f35a2b956da2c9dcb4b2acef881ae202604567408b11f9beb84019c4\": container with ID starting with 3a0736a0f35a2b956da2c9dcb4b2acef881ae202604567408b11f9beb84019c4 not found: ID does not exist" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.676227 4703 scope.go:117] "RemoveContainer" containerID="2873c0f36a60a00efbd2fafdc6d961f427ff278d1cd38fb45f4cd7072ff4f812" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.695166 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-4m97d"] Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.706431 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-4m97d"] Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.710021 4703 scope.go:117] "RemoveContainer" 
containerID="2873c0f36a60a00efbd2fafdc6d961f427ff278d1cd38fb45f4cd7072ff4f812" Mar 09 13:45:28 crc kubenswrapper[4703]: E0309 13:45:28.710530 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2873c0f36a60a00efbd2fafdc6d961f427ff278d1cd38fb45f4cd7072ff4f812\": container with ID starting with 2873c0f36a60a00efbd2fafdc6d961f427ff278d1cd38fb45f4cd7072ff4f812 not found: ID does not exist" containerID="2873c0f36a60a00efbd2fafdc6d961f427ff278d1cd38fb45f4cd7072ff4f812" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.710565 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2873c0f36a60a00efbd2fafdc6d961f427ff278d1cd38fb45f4cd7072ff4f812"} err="failed to get container status \"2873c0f36a60a00efbd2fafdc6d961f427ff278d1cd38fb45f4cd7072ff4f812\": rpc error: code = NotFound desc = could not find container \"2873c0f36a60a00efbd2fafdc6d961f427ff278d1cd38fb45f4cd7072ff4f812\": container with ID starting with 2873c0f36a60a00efbd2fafdc6d961f427ff278d1cd38fb45f4cd7072ff4f812 not found: ID does not exist" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.715796 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="501adaf2-d65b-4e4f-97f5-aefac5df2d6f" path="/var/lib/kubelet/pods/501adaf2-d65b-4e4f-97f5-aefac5df2d6f/volumes" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.716709 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67047f4d-9af8-4bbe-841e-a0f0260951d1" path="/var/lib/kubelet/pods/67047f4d-9af8-4bbe-841e-a0f0260951d1/volumes" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.717686 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aadbd6bd-930e-473a-851e-820fbfcd22db" path="/var/lib/kubelet/pods/aadbd6bd-930e-473a-851e-820fbfcd22db/volumes" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.719061 4703 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="b3ec581a-8cf6-40d6-abfa-347b39c624c2" path="/var/lib/kubelet/pods/b3ec581a-8cf6-40d6-abfa-347b39c624c2/volumes" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.719547 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5"] Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.719668 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-94c4fbb79-v7dq5"] Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.992623 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-grcqc"] Mar 09 13:45:28 crc kubenswrapper[4703]: E0309 13:45:28.992903 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be038e1-1768-4ebe-9ec8-d86cb2173e75" containerName="manager" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.992919 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be038e1-1768-4ebe-9ec8-d86cb2173e75" containerName="manager" Mar 09 13:45:28 crc kubenswrapper[4703]: E0309 13:45:28.992937 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="501adaf2-d65b-4e4f-97f5-aefac5df2d6f" containerName="registry-server" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.992945 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="501adaf2-d65b-4e4f-97f5-aefac5df2d6f" containerName="registry-server" Mar 09 13:45:28 crc kubenswrapper[4703]: E0309 13:45:28.992965 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ec581a-8cf6-40d6-abfa-347b39c624c2" containerName="manager" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.992974 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ec581a-8cf6-40d6-abfa-347b39c624c2" containerName="manager" Mar 09 13:45:28 crc kubenswrapper[4703]: E0309 13:45:28.992991 4703 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="895726c9-a46b-4503-b195-ef668833c34f" containerName="registry-server" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.993000 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="895726c9-a46b-4503-b195-ef668833c34f" containerName="registry-server" Mar 09 13:45:28 crc kubenswrapper[4703]: E0309 13:45:28.993013 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98ec184b-eb8e-4967-b8ec-17cb6f984ccb" containerName="ceph" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.993021 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="98ec184b-eb8e-4967-b8ec-17cb6f984ccb" containerName="ceph" Mar 09 13:45:28 crc kubenswrapper[4703]: E0309 13:45:28.993036 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f2ff969-dbb6-4152-88e4-bab9ce6ff722" containerName="collect-profiles" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.993044 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f2ff969-dbb6-4152-88e4-bab9ce6ff722" containerName="collect-profiles" Mar 09 13:45:28 crc kubenswrapper[4703]: E0309 13:45:28.993055 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54568517-643e-4d68-bb0b-daf6562bb23d" containerName="registry-server" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.993063 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="54568517-643e-4d68-bb0b-daf6562bb23d" containerName="registry-server" Mar 09 13:45:28 crc kubenswrapper[4703]: E0309 13:45:28.993079 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aadbd6bd-930e-473a-851e-820fbfcd22db" containerName="registry-server" Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.993088 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="aadbd6bd-930e-473a-851e-820fbfcd22db" containerName="registry-server" Mar 09 13:45:28 crc kubenswrapper[4703]: E0309 13:45:28.993102 4703 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ac75edd0-0cc7-42c5-b1c1-6afe38f7adef" containerName="manager"
Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.993110 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac75edd0-0cc7-42c5-b1c1-6afe38f7adef" containerName="manager"
Mar 09 13:45:28 crc kubenswrapper[4703]: E0309 13:45:28.993122 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c690e1-16e7-4418-b82c-a846f6de3430" containerName="operator"
Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.993130 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c690e1-16e7-4418-b82c-a846f6de3430" containerName="operator"
Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.993248 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="aadbd6bd-930e-473a-851e-820fbfcd22db" containerName="registry-server"
Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.993260 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="895726c9-a46b-4503-b195-ef668833c34f" containerName="registry-server"
Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.993272 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="5be038e1-1768-4ebe-9ec8-d86cb2173e75" containerName="manager"
Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.993286 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f2ff969-dbb6-4152-88e4-bab9ce6ff722" containerName="collect-profiles"
Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.993307 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac75edd0-0cc7-42c5-b1c1-6afe38f7adef" containerName="manager"
Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.993319 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="54568517-643e-4d68-bb0b-daf6562bb23d" containerName="registry-server"
Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.993329 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7c690e1-16e7-4418-b82c-a846f6de3430" containerName="operator"
Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.993341 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7e8c250-7292-47d9-92f6-6ec4b4a1c6e3" containerName="mariadb-account-delete"
Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.993354 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="98ec184b-eb8e-4967-b8ec-17cb6f984ccb" containerName="ceph"
Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.993365 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ec581a-8cf6-40d6-abfa-347b39c624c2" containerName="manager"
Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.993376 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="501adaf2-d65b-4e4f-97f5-aefac5df2d6f" containerName="registry-server"
Mar 09 13:45:28 crc kubenswrapper[4703]: I0309 13:45:28.994310 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grcqc"
Mar 09 13:45:29 crc kubenswrapper[4703]: I0309 13:45:29.012092 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-grcqc"]
Mar 09 13:45:29 crc kubenswrapper[4703]: I0309 13:45:29.095413 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cc9c797-b31f-42d1-8235-0bd3cde7ff3a-utilities\") pod \"redhat-operators-grcqc\" (UID: \"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a\") " pod="openshift-marketplace/redhat-operators-grcqc"
Mar 09 13:45:29 crc kubenswrapper[4703]: I0309 13:45:29.095483 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cc9c797-b31f-42d1-8235-0bd3cde7ff3a-catalog-content\") pod \"redhat-operators-grcqc\" (UID: \"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a\") " pod="openshift-marketplace/redhat-operators-grcqc"
Mar 09 13:45:29 crc kubenswrapper[4703]: I0309 13:45:29.095518 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4dsd\" (UniqueName: \"kubernetes.io/projected/7cc9c797-b31f-42d1-8235-0bd3cde7ff3a-kube-api-access-p4dsd\") pod \"redhat-operators-grcqc\" (UID: \"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a\") " pod="openshift-marketplace/redhat-operators-grcqc"
Mar 09 13:45:29 crc kubenswrapper[4703]: I0309 13:45:29.197014 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cc9c797-b31f-42d1-8235-0bd3cde7ff3a-catalog-content\") pod \"redhat-operators-grcqc\" (UID: \"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a\") " pod="openshift-marketplace/redhat-operators-grcqc"
Mar 09 13:45:29 crc kubenswrapper[4703]: I0309 13:45:29.197073 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4dsd\" (UniqueName: \"kubernetes.io/projected/7cc9c797-b31f-42d1-8235-0bd3cde7ff3a-kube-api-access-p4dsd\") pod \"redhat-operators-grcqc\" (UID: \"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a\") " pod="openshift-marketplace/redhat-operators-grcqc"
Mar 09 13:45:29 crc kubenswrapper[4703]: I0309 13:45:29.197121 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cc9c797-b31f-42d1-8235-0bd3cde7ff3a-utilities\") pod \"redhat-operators-grcqc\" (UID: \"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a\") " pod="openshift-marketplace/redhat-operators-grcqc"
Mar 09 13:45:29 crc kubenswrapper[4703]: I0309 13:45:29.197574 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cc9c797-b31f-42d1-8235-0bd3cde7ff3a-utilities\") pod \"redhat-operators-grcqc\" (UID: \"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a\") " pod="openshift-marketplace/redhat-operators-grcqc"
Mar 09 13:45:29 crc kubenswrapper[4703]: I0309 13:45:29.197695 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cc9c797-b31f-42d1-8235-0bd3cde7ff3a-catalog-content\") pod \"redhat-operators-grcqc\" (UID: \"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a\") " pod="openshift-marketplace/redhat-operators-grcqc"
Mar 09 13:45:29 crc kubenswrapper[4703]: I0309 13:45:29.214543 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4dsd\" (UniqueName: \"kubernetes.io/projected/7cc9c797-b31f-42d1-8235-0bd3cde7ff3a-kube-api-access-p4dsd\") pod \"redhat-operators-grcqc\" (UID: \"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a\") " pod="openshift-marketplace/redhat-operators-grcqc"
Mar 09 13:45:29 crc kubenswrapper[4703]: I0309 13:45:29.324562 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grcqc"
Mar 09 13:45:29 crc kubenswrapper[4703]: I0309 13:45:29.744404 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-grcqc"]
Mar 09 13:45:30 crc kubenswrapper[4703]: I0309 13:45:30.678019 4703 generic.go:334] "Generic (PLEG): container finished" podID="7cc9c797-b31f-42d1-8235-0bd3cde7ff3a" containerID="6ccde9f8af0ac752b374ac2aef71a18dbc7ad1c777aca7355675b559a306cd4e" exitCode=0
Mar 09 13:45:30 crc kubenswrapper[4703]: I0309 13:45:30.678095 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grcqc" event={"ID":"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a","Type":"ContainerDied","Data":"6ccde9f8af0ac752b374ac2aef71a18dbc7ad1c777aca7355675b559a306cd4e"}
Mar 09 13:45:30 crc kubenswrapper[4703]: I0309 13:45:30.678350 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grcqc" event={"ID":"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a","Type":"ContainerStarted","Data":"8e04f0dea642a6dfc112984b6d209a6948bade6ffbda78ca98c23c7166a51f81"}
Mar 09 13:45:30 crc kubenswrapper[4703]: I0309 13:45:30.680452 4703 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 13:45:30 crc kubenswrapper[4703]: I0309 13:45:30.714723 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5be038e1-1768-4ebe-9ec8-d86cb2173e75" path="/var/lib/kubelet/pods/5be038e1-1768-4ebe-9ec8-d86cb2173e75/volumes"
Mar 09 13:45:31 crc kubenswrapper[4703]: I0309 13:45:31.685514 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grcqc" event={"ID":"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a","Type":"ContainerStarted","Data":"0726b133f29d7bee5b0e9fa08c854e6f0c10385c09e161014ae97ca259571d06"}
Mar 09 13:45:32 crc kubenswrapper[4703]: I0309 13:45:32.695391 4703 generic.go:334] "Generic (PLEG): container finished" podID="7cc9c797-b31f-42d1-8235-0bd3cde7ff3a" containerID="0726b133f29d7bee5b0e9fa08c854e6f0c10385c09e161014ae97ca259571d06" exitCode=0
Mar 09 13:45:32 crc kubenswrapper[4703]: I0309 13:45:32.695435 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grcqc" event={"ID":"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a","Type":"ContainerDied","Data":"0726b133f29d7bee5b0e9fa08c854e6f0c10385c09e161014ae97ca259571d06"}
Mar 09 13:45:33 crc kubenswrapper[4703]: I0309 13:45:33.706511 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grcqc" event={"ID":"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a","Type":"ContainerStarted","Data":"3a25818d116a8294bef42c2c0ec79871701c182aa08ddaec92596508cb761e95"}
Mar 09 13:45:39 crc kubenswrapper[4703]: I0309 13:45:39.325533 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-grcqc"
Mar 09 13:45:39 crc kubenswrapper[4703]: I0309 13:45:39.326072 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-grcqc"
Mar 09 13:45:39 crc kubenswrapper[4703]: I0309 13:45:39.500344 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:45:39 crc kubenswrapper[4703]: I0309 13:45:39.500403 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:45:40 crc kubenswrapper[4703]: I0309 13:45:40.386446 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-grcqc" podUID="7cc9c797-b31f-42d1-8235-0bd3cde7ff3a" containerName="registry-server" probeResult="failure" output=<
Mar 09 13:45:40 crc kubenswrapper[4703]: timeout: failed to connect service ":50051" within 1s
Mar 09 13:45:40 crc kubenswrapper[4703]: >
Mar 09 13:45:49 crc kubenswrapper[4703]: I0309 13:45:49.388162 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-grcqc"
Mar 09 13:45:49 crc kubenswrapper[4703]: I0309 13:45:49.409894 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-grcqc" podStartSLOduration=18.971685099 podStartE2EDuration="21.409875205s" podCreationTimestamp="2026-03-09 13:45:28 +0000 UTC" firstStartedPulling="2026-03-09 13:45:30.680114883 +0000 UTC m=+1526.647530569" lastFinishedPulling="2026-03-09 13:45:33.118304959 +0000 UTC m=+1529.085720675" observedRunningTime="2026-03-09 13:45:33.736480442 +0000 UTC m=+1529.703896168" watchObservedRunningTime="2026-03-09 13:45:49.409875205 +0000 UTC m=+1545.377290911"
Mar 09 13:45:49 crc kubenswrapper[4703]: I0309 13:45:49.446553 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-grcqc"
Mar 09 13:45:49 crc kubenswrapper[4703]: I0309 13:45:49.625203 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-grcqc"]
Mar 09 13:45:50 crc kubenswrapper[4703]: I0309 13:45:50.839290 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-grcqc" podUID="7cc9c797-b31f-42d1-8235-0bd3cde7ff3a" containerName="registry-server" containerID="cri-o://3a25818d116a8294bef42c2c0ec79871701c182aa08ddaec92596508cb761e95" gracePeriod=2
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.243519 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p66pf/must-gather-hlwpf"]
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.246127 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p66pf/must-gather-hlwpf"
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.248536 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-p66pf"/"openshift-service-ca.crt"
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.248760 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-p66pf"/"kube-root-ca.crt"
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.263012 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p66pf/must-gather-hlwpf"]
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.322997 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grcqc"
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.414756 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cc9c797-b31f-42d1-8235-0bd3cde7ff3a-utilities\") pod \"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a\" (UID: \"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a\") "
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.414807 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cc9c797-b31f-42d1-8235-0bd3cde7ff3a-catalog-content\") pod \"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a\" (UID: \"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a\") "
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.414972 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4dsd\" (UniqueName: \"kubernetes.io/projected/7cc9c797-b31f-42d1-8235-0bd3cde7ff3a-kube-api-access-p4dsd\") pod \"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a\" (UID: \"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a\") "
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.415181 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2z9s\" (UniqueName: \"kubernetes.io/projected/98a41f6a-d624-41bc-ac11-cf49a7e96960-kube-api-access-z2z9s\") pod \"must-gather-hlwpf\" (UID: \"98a41f6a-d624-41bc-ac11-cf49a7e96960\") " pod="openshift-must-gather-p66pf/must-gather-hlwpf"
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.415249 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98a41f6a-d624-41bc-ac11-cf49a7e96960-must-gather-output\") pod \"must-gather-hlwpf\" (UID: \"98a41f6a-d624-41bc-ac11-cf49a7e96960\") " pod="openshift-must-gather-p66pf/must-gather-hlwpf"
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.415757 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cc9c797-b31f-42d1-8235-0bd3cde7ff3a-utilities" (OuterVolumeSpecName: "utilities") pod "7cc9c797-b31f-42d1-8235-0bd3cde7ff3a" (UID: "7cc9c797-b31f-42d1-8235-0bd3cde7ff3a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.430125 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cc9c797-b31f-42d1-8235-0bd3cde7ff3a-kube-api-access-p4dsd" (OuterVolumeSpecName: "kube-api-access-p4dsd") pod "7cc9c797-b31f-42d1-8235-0bd3cde7ff3a" (UID: "7cc9c797-b31f-42d1-8235-0bd3cde7ff3a"). InnerVolumeSpecName "kube-api-access-p4dsd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.516643 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98a41f6a-d624-41bc-ac11-cf49a7e96960-must-gather-output\") pod \"must-gather-hlwpf\" (UID: \"98a41f6a-d624-41bc-ac11-cf49a7e96960\") " pod="openshift-must-gather-p66pf/must-gather-hlwpf"
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.516758 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2z9s\" (UniqueName: \"kubernetes.io/projected/98a41f6a-d624-41bc-ac11-cf49a7e96960-kube-api-access-z2z9s\") pod \"must-gather-hlwpf\" (UID: \"98a41f6a-d624-41bc-ac11-cf49a7e96960\") " pod="openshift-must-gather-p66pf/must-gather-hlwpf"
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.516824 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cc9c797-b31f-42d1-8235-0bd3cde7ff3a-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.516859 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4dsd\" (UniqueName: \"kubernetes.io/projected/7cc9c797-b31f-42d1-8235-0bd3cde7ff3a-kube-api-access-p4dsd\") on node \"crc\" DevicePath \"\""
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.517291 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98a41f6a-d624-41bc-ac11-cf49a7e96960-must-gather-output\") pod \"must-gather-hlwpf\" (UID: \"98a41f6a-d624-41bc-ac11-cf49a7e96960\") " pod="openshift-must-gather-p66pf/must-gather-hlwpf"
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.561745 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cc9c797-b31f-42d1-8235-0bd3cde7ff3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7cc9c797-b31f-42d1-8235-0bd3cde7ff3a" (UID: "7cc9c797-b31f-42d1-8235-0bd3cde7ff3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.566726 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2z9s\" (UniqueName: \"kubernetes.io/projected/98a41f6a-d624-41bc-ac11-cf49a7e96960-kube-api-access-z2z9s\") pod \"must-gather-hlwpf\" (UID: \"98a41f6a-d624-41bc-ac11-cf49a7e96960\") " pod="openshift-must-gather-p66pf/must-gather-hlwpf"
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.617756 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cc9c797-b31f-42d1-8235-0bd3cde7ff3a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.849910 4703 generic.go:334] "Generic (PLEG): container finished" podID="7cc9c797-b31f-42d1-8235-0bd3cde7ff3a" containerID="3a25818d116a8294bef42c2c0ec79871701c182aa08ddaec92596508cb761e95" exitCode=0
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.849953 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grcqc" event={"ID":"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a","Type":"ContainerDied","Data":"3a25818d116a8294bef42c2c0ec79871701c182aa08ddaec92596508cb761e95"}
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.849981 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grcqc" event={"ID":"7cc9c797-b31f-42d1-8235-0bd3cde7ff3a","Type":"ContainerDied","Data":"8e04f0dea642a6dfc112984b6d209a6948bade6ffbda78ca98c23c7166a51f81"}
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.849997 4703 scope.go:117] "RemoveContainer" containerID="3a25818d116a8294bef42c2c0ec79871701c182aa08ddaec92596508cb761e95"
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.850075 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grcqc"
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.860412 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p66pf/must-gather-hlwpf"
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.867638 4703 scope.go:117] "RemoveContainer" containerID="0726b133f29d7bee5b0e9fa08c854e6f0c10385c09e161014ae97ca259571d06"
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.892547 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-grcqc"]
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.896993 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-grcqc"]
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.903221 4703 scope.go:117] "RemoveContainer" containerID="6ccde9f8af0ac752b374ac2aef71a18dbc7ad1c777aca7355675b559a306cd4e"
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.924145 4703 scope.go:117] "RemoveContainer" containerID="3a25818d116a8294bef42c2c0ec79871701c182aa08ddaec92596508cb761e95"
Mar 09 13:45:51 crc kubenswrapper[4703]: E0309 13:45:51.924646 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a25818d116a8294bef42c2c0ec79871701c182aa08ddaec92596508cb761e95\": container with ID starting with 3a25818d116a8294bef42c2c0ec79871701c182aa08ddaec92596508cb761e95 not found: ID does not exist" containerID="3a25818d116a8294bef42c2c0ec79871701c182aa08ddaec92596508cb761e95"
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.924680 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a25818d116a8294bef42c2c0ec79871701c182aa08ddaec92596508cb761e95"} err="failed to get container status \"3a25818d116a8294bef42c2c0ec79871701c182aa08ddaec92596508cb761e95\": rpc error: code = NotFound desc = could not find container \"3a25818d116a8294bef42c2c0ec79871701c182aa08ddaec92596508cb761e95\": container with ID starting with 3a25818d116a8294bef42c2c0ec79871701c182aa08ddaec92596508cb761e95 not found: ID does not exist"
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.924718 4703 scope.go:117] "RemoveContainer" containerID="0726b133f29d7bee5b0e9fa08c854e6f0c10385c09e161014ae97ca259571d06"
Mar 09 13:45:51 crc kubenswrapper[4703]: E0309 13:45:51.925074 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0726b133f29d7bee5b0e9fa08c854e6f0c10385c09e161014ae97ca259571d06\": container with ID starting with 0726b133f29d7bee5b0e9fa08c854e6f0c10385c09e161014ae97ca259571d06 not found: ID does not exist" containerID="0726b133f29d7bee5b0e9fa08c854e6f0c10385c09e161014ae97ca259571d06"
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.925102 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0726b133f29d7bee5b0e9fa08c854e6f0c10385c09e161014ae97ca259571d06"} err="failed to get container status \"0726b133f29d7bee5b0e9fa08c854e6f0c10385c09e161014ae97ca259571d06\": rpc error: code = NotFound desc = could not find container \"0726b133f29d7bee5b0e9fa08c854e6f0c10385c09e161014ae97ca259571d06\": container with ID starting with 0726b133f29d7bee5b0e9fa08c854e6f0c10385c09e161014ae97ca259571d06 not found: ID does not exist"
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.925175 4703 scope.go:117] "RemoveContainer" containerID="6ccde9f8af0ac752b374ac2aef71a18dbc7ad1c777aca7355675b559a306cd4e"
Mar 09 13:45:51 crc kubenswrapper[4703]: E0309 13:45:51.925858 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ccde9f8af0ac752b374ac2aef71a18dbc7ad1c777aca7355675b559a306cd4e\": container with ID starting with 6ccde9f8af0ac752b374ac2aef71a18dbc7ad1c777aca7355675b559a306cd4e not found: ID does not exist" containerID="6ccde9f8af0ac752b374ac2aef71a18dbc7ad1c777aca7355675b559a306cd4e"
Mar 09 13:45:51 crc kubenswrapper[4703]: I0309 13:45:51.925910 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ccde9f8af0ac752b374ac2aef71a18dbc7ad1c777aca7355675b559a306cd4e"} err="failed to get container status \"6ccde9f8af0ac752b374ac2aef71a18dbc7ad1c777aca7355675b559a306cd4e\": rpc error: code = NotFound desc = could not find container \"6ccde9f8af0ac752b374ac2aef71a18dbc7ad1c777aca7355675b559a306cd4e\": container with ID starting with 6ccde9f8af0ac752b374ac2aef71a18dbc7ad1c777aca7355675b559a306cd4e not found: ID does not exist"
Mar 09 13:45:52 crc kubenswrapper[4703]: I0309 13:45:52.298021 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p66pf/must-gather-hlwpf"]
Mar 09 13:45:52 crc kubenswrapper[4703]: W0309 13:45:52.312116 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98a41f6a_d624_41bc_ac11_cf49a7e96960.slice/crio-64491acf2e232097ad29e89ee289f6be2e99f8c1553fac063d12c81591525236 WatchSource:0}: Error finding container 64491acf2e232097ad29e89ee289f6be2e99f8c1553fac063d12c81591525236: Status 404 returned error can't find the container with id 64491acf2e232097ad29e89ee289f6be2e99f8c1553fac063d12c81591525236
Mar 09 13:45:52 crc kubenswrapper[4703]: I0309 13:45:52.717769 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cc9c797-b31f-42d1-8235-0bd3cde7ff3a" path="/var/lib/kubelet/pods/7cc9c797-b31f-42d1-8235-0bd3cde7ff3a/volumes"
Mar 09 13:45:52 crc kubenswrapper[4703]: I0309 13:45:52.858817 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p66pf/must-gather-hlwpf" event={"ID":"98a41f6a-d624-41bc-ac11-cf49a7e96960","Type":"ContainerStarted","Data":"64491acf2e232097ad29e89ee289f6be2e99f8c1553fac063d12c81591525236"}
Mar 09 13:45:58 crc kubenswrapper[4703]: I0309 13:45:58.899426 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p66pf/must-gather-hlwpf" event={"ID":"98a41f6a-d624-41bc-ac11-cf49a7e96960","Type":"ContainerStarted","Data":"7db5ccdbe2c1587ac5c5eef8e19a4c15aab42968b31a816462c11158ea73df08"}
Mar 09 13:45:58 crc kubenswrapper[4703]: I0309 13:45:58.899801 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p66pf/must-gather-hlwpf" event={"ID":"98a41f6a-d624-41bc-ac11-cf49a7e96960","Type":"ContainerStarted","Data":"29fc4c9bf4c83cec0520dccea0e52c5a7efc439def903286e64a09031b711e95"}
Mar 09 13:45:58 crc kubenswrapper[4703]: I0309 13:45:58.952466 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p66pf/must-gather-hlwpf" podStartSLOduration=1.863823554 podStartE2EDuration="7.952436838s" podCreationTimestamp="2026-03-09 13:45:51 +0000 UTC" firstStartedPulling="2026-03-09 13:45:52.315499992 +0000 UTC m=+1548.282915678" lastFinishedPulling="2026-03-09 13:45:58.404113286 +0000 UTC m=+1554.371528962" observedRunningTime="2026-03-09 13:45:58.947406683 +0000 UTC m=+1554.914822369" watchObservedRunningTime="2026-03-09 13:45:58.952436838 +0000 UTC m=+1554.919852564"
Mar 09 13:46:00 crc kubenswrapper[4703]: I0309 13:46:00.132199 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551066-799x4"]
Mar 09 13:46:00 crc kubenswrapper[4703]: E0309 13:46:00.132535 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc9c797-b31f-42d1-8235-0bd3cde7ff3a" containerName="registry-server"
Mar 09 13:46:00 crc kubenswrapper[4703]: I0309 13:46:00.132553 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc9c797-b31f-42d1-8235-0bd3cde7ff3a" containerName="registry-server"
Mar 09 13:46:00 crc kubenswrapper[4703]: E0309 13:46:00.132574 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc9c797-b31f-42d1-8235-0bd3cde7ff3a" containerName="extract-content"
Mar 09 13:46:00 crc kubenswrapper[4703]: I0309 13:46:00.132580 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc9c797-b31f-42d1-8235-0bd3cde7ff3a" containerName="extract-content"
Mar 09 13:46:00 crc kubenswrapper[4703]: E0309 13:46:00.132587 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc9c797-b31f-42d1-8235-0bd3cde7ff3a" containerName="extract-utilities"
Mar 09 13:46:00 crc kubenswrapper[4703]: I0309 13:46:00.132596 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc9c797-b31f-42d1-8235-0bd3cde7ff3a" containerName="extract-utilities"
Mar 09 13:46:00 crc kubenswrapper[4703]: I0309 13:46:00.132728 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc9c797-b31f-42d1-8235-0bd3cde7ff3a" containerName="registry-server"
Mar 09 13:46:00 crc kubenswrapper[4703]: I0309 13:46:00.133306 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551066-799x4"
Mar 09 13:46:00 crc kubenswrapper[4703]: I0309 13:46:00.139549 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 13:46:00 crc kubenswrapper[4703]: I0309 13:46:00.139649 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rkrjn"
Mar 09 13:46:00 crc kubenswrapper[4703]: I0309 13:46:00.139688 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 13:46:00 crc kubenswrapper[4703]: I0309 13:46:00.151701 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551066-799x4"]
Mar 09 13:46:00 crc kubenswrapper[4703]: I0309 13:46:00.222741 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5fvh\" (UniqueName: \"kubernetes.io/projected/8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1-kube-api-access-f5fvh\") pod \"auto-csr-approver-29551066-799x4\" (UID: \"8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1\") " pod="openshift-infra/auto-csr-approver-29551066-799x4"
Mar 09 13:46:00 crc kubenswrapper[4703]: I0309 13:46:00.323658 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5fvh\" (UniqueName: \"kubernetes.io/projected/8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1-kube-api-access-f5fvh\") pod \"auto-csr-approver-29551066-799x4\" (UID: \"8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1\") " pod="openshift-infra/auto-csr-approver-29551066-799x4"
Mar 09 13:46:00 crc kubenswrapper[4703]: I0309 13:46:00.349755 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5fvh\" (UniqueName: \"kubernetes.io/projected/8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1-kube-api-access-f5fvh\") pod \"auto-csr-approver-29551066-799x4\" (UID: \"8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1\") " pod="openshift-infra/auto-csr-approver-29551066-799x4"
Mar 09 13:46:00 crc kubenswrapper[4703]: I0309 13:46:00.454902 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551066-799x4"
Mar 09 13:46:00 crc kubenswrapper[4703]: I0309 13:46:00.683738 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551066-799x4"]
Mar 09 13:46:00 crc kubenswrapper[4703]: I0309 13:46:00.914176 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551066-799x4" event={"ID":"8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1","Type":"ContainerStarted","Data":"c9063f129a6123bdda7ac46a981b9f6bf049676cc38be7e2a3f4a8df2b9f7724"}
Mar 09 13:46:01 crc kubenswrapper[4703]: I0309 13:46:01.921119 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551066-799x4" event={"ID":"8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1","Type":"ContainerStarted","Data":"3cd07f5da59dfaabbaa13949c32ac067d296ea3a6a0934fd5fcc5f70670c8ede"}
Mar 09 13:46:01 crc kubenswrapper[4703]: I0309 13:46:01.933660 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551066-799x4" podStartSLOduration=1.010603704 podStartE2EDuration="1.933632747s" podCreationTimestamp="2026-03-09 13:46:00 +0000 UTC" firstStartedPulling="2026-03-09 13:46:00.689183367 +0000 UTC m=+1556.656599053" lastFinishedPulling="2026-03-09 13:46:01.61221239 +0000 UTC m=+1557.579628096" observedRunningTime="2026-03-09 13:46:01.932649759 +0000 UTC m=+1557.900065445" watchObservedRunningTime="2026-03-09 13:46:01.933632747 +0000 UTC m=+1557.901048433"
Mar 09 13:46:02 crc kubenswrapper[4703]: I0309 13:46:02.928951 4703 generic.go:334] "Generic (PLEG): container finished" podID="8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1" containerID="3cd07f5da59dfaabbaa13949c32ac067d296ea3a6a0934fd5fcc5f70670c8ede" exitCode=0
Mar 09 13:46:02 crc kubenswrapper[4703]: I0309 13:46:02.929043 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551066-799x4" event={"ID":"8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1","Type":"ContainerDied","Data":"3cd07f5da59dfaabbaa13949c32ac067d296ea3a6a0934fd5fcc5f70670c8ede"}
Mar 09 13:46:04 crc kubenswrapper[4703]: I0309 13:46:04.262980 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551066-799x4"
Mar 09 13:46:04 crc kubenswrapper[4703]: I0309 13:46:04.375230 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5fvh\" (UniqueName: \"kubernetes.io/projected/8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1-kube-api-access-f5fvh\") pod \"8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1\" (UID: \"8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1\") "
Mar 09 13:46:04 crc kubenswrapper[4703]: I0309 13:46:04.380932 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1-kube-api-access-f5fvh" (OuterVolumeSpecName: "kube-api-access-f5fvh") pod "8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1" (UID: "8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1"). InnerVolumeSpecName "kube-api-access-f5fvh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 13:46:04 crc kubenswrapper[4703]: I0309 13:46:04.476906 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5fvh\" (UniqueName: \"kubernetes.io/projected/8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1-kube-api-access-f5fvh\") on node \"crc\" DevicePath \"\""
Mar 09 13:46:04 crc kubenswrapper[4703]: I0309 13:46:04.947305 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551066-799x4" event={"ID":"8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1","Type":"ContainerDied","Data":"c9063f129a6123bdda7ac46a981b9f6bf049676cc38be7e2a3f4a8df2b9f7724"}
Mar 09 13:46:04 crc kubenswrapper[4703]: I0309 13:46:04.947364 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9063f129a6123bdda7ac46a981b9f6bf049676cc38be7e2a3f4a8df2b9f7724"
Mar 09 13:46:04 crc kubenswrapper[4703]: I0309 13:46:04.947392 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551066-799x4"
Mar 09 13:46:05 crc kubenswrapper[4703]: I0309 13:46:05.010025 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551060-27x8b"]
Mar 09 13:46:05 crc kubenswrapper[4703]: I0309 13:46:05.014468 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551060-27x8b"]
Mar 09 13:46:06 crc kubenswrapper[4703]: I0309 13:46:06.714138 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86115e2b-93f7-439b-99e8-59baa25f2491" path="/var/lib/kubelet/pods/86115e2b-93f7-439b-99e8-59baa25f2491/volumes"
Mar 09 13:46:09 crc kubenswrapper[4703]: I0309 13:46:09.500415 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 13:46:09 crc kubenswrapper[4703]: I0309 13:46:09.501506 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 13:46:09 crc kubenswrapper[4703]: I0309 13:46:09.501664 4703 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj"
Mar 09 13:46:09 crc kubenswrapper[4703]: I0309 13:46:09.502294 4703 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f"} pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 13:46:09 crc kubenswrapper[4703]: I0309 13:46:09.502432 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" containerID="cri-o://4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" gracePeriod=600
Mar 09 13:46:09 crc kubenswrapper[4703]: E0309 13:46:09.633529 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29"
Mar 09 13:46:09 crc kubenswrapper[4703]: I0309 13:46:09.990791 4703 generic.go:334] "Generic (PLEG): container finished" podID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" exitCode=0
Mar 09 13:46:09 crc kubenswrapper[4703]: I0309 13:46:09.990903 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" event={"ID":"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29","Type":"ContainerDied","Data":"4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f"}
Mar 09 13:46:09 crc kubenswrapper[4703]: I0309 13:46:09.991645 4703 scope.go:117] "RemoveContainer" containerID="63593ee12ca38b373f82a991b81f46458f13b7e7d0525d9d0305c5a08fc34572"
Mar 09 13:46:09 crc kubenswrapper[4703]: I0309 13:46:09.992306 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f"
Mar 09 13:46:09 crc kubenswrapper[4703]: E0309 13:46:09.992765 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29"
Mar 09 13:46:22 crc kubenswrapper[4703]: I0309 13:46:22.707207 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f"
Mar 09 13:46:22 crc kubenswrapper[4703]: E0309 13:46:22.707816 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29"
Mar 09 13:46:28 crc kubenswrapper[4703]: I0309 13:46:28.081349 4703 scope.go:117] "RemoveContainer" containerID="e82fc16aa929fec22073aa7b689502f0c09d7dd0a24661be28cb43d88ebc858d"
Mar 09 13:46:28 crc kubenswrapper[4703]: I0309 13:46:28.115242 4703 scope.go:117] "RemoveContainer" containerID="46be58143c5655ade22fd8aac3f086bdb5cb8d1805cb0bb507632272e93f947a"
Mar 09 13:46:28 crc kubenswrapper[4703]: I0309 13:46:28.175715 4703 scope.go:117] "RemoveContainer" containerID="74867807e231a988cf8ff5f2097f91b1ae6c1cd31322635c4495d834a3894291"
Mar 09 13:46:28 crc kubenswrapper[4703]: I0309 13:46:28.202220 4703 scope.go:117] "RemoveContainer" containerID="fef669e8097a182022f24fc2b549ca6651eb824d7f6d6c46f0609b95efaa85ce"
Mar 09 13:46:28 crc kubenswrapper[4703]: I0309 13:46:28.224452 4703 scope.go:117] "RemoveContainer" containerID="e343ed8ae902d506b5f08c379658c23e46e43d4c2e7f091d3685a50da8ed24af"
Mar 09
13:46:28 crc kubenswrapper[4703]: I0309 13:46:28.245568 4703 scope.go:117] "RemoveContainer" containerID="a787fc6c4a4228f73efd5b310736629312dbe98466b00f7e398cc6bfae44b308" Mar 09 13:46:28 crc kubenswrapper[4703]: I0309 13:46:28.267354 4703 scope.go:117] "RemoveContainer" containerID="2aa1b5408cefe64fe06380dbac60b17f9050ade6d25f89496f034660467007a8" Mar 09 13:46:37 crc kubenswrapper[4703]: I0309 13:46:37.707406 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:46:37 crc kubenswrapper[4703]: E0309 13:46:37.708654 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:46:49 crc kubenswrapper[4703]: I0309 13:46:49.707326 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:46:49 crc kubenswrapper[4703]: E0309 13:46:49.708243 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:46:50 crc kubenswrapper[4703]: I0309 13:46:50.005159 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-kzdxt_9b14d2ef-2e68-4bb5-be73-2bbbb837c463/control-plane-machine-set-operator/0.log" Mar 09 13:46:50 crc 
kubenswrapper[4703]: I0309 13:46:50.138513 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6qv87_7e8298d3-3e49-4df3-9369-2623b11981cf/kube-rbac-proxy/0.log" Mar 09 13:46:50 crc kubenswrapper[4703]: I0309 13:46:50.163256 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6qv87_7e8298d3-3e49-4df3-9369-2623b11981cf/machine-api-operator/0.log" Mar 09 13:47:00 crc kubenswrapper[4703]: I0309 13:47:00.707735 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:47:00 crc kubenswrapper[4703]: E0309 13:47:00.708436 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:47:11 crc kubenswrapper[4703]: I0309 13:47:11.707424 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:47:11 crc kubenswrapper[4703]: E0309 13:47:11.708444 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:47:18 crc kubenswrapper[4703]: I0309 13:47:18.457371 4703 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-86ddb6bd46-htlzv_1b3ec15b-631d-4ea5-b1f2-899a5d44785d/kube-rbac-proxy/0.log" Mar 09 13:47:18 crc kubenswrapper[4703]: I0309 13:47:18.518821 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-htlzv_1b3ec15b-631d-4ea5-b1f2-899a5d44785d/controller/0.log" Mar 09 13:47:18 crc kubenswrapper[4703]: I0309 13:47:18.646262 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-mqpjj_7941f29f-2aed-45e8-b9c3-d4e0c19573ee/frr-k8s-webhook-server/0.log" Mar 09 13:47:18 crc kubenswrapper[4703]: I0309 13:47:18.701479 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-frr-files/0.log" Mar 09 13:47:18 crc kubenswrapper[4703]: I0309 13:47:18.850795 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-frr-files/0.log" Mar 09 13:47:18 crc kubenswrapper[4703]: I0309 13:47:18.853565 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-metrics/0.log" Mar 09 13:47:18 crc kubenswrapper[4703]: I0309 13:47:18.862084 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-reloader/0.log" Mar 09 13:47:18 crc kubenswrapper[4703]: I0309 13:47:18.891807 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-reloader/0.log" Mar 09 13:47:19 crc kubenswrapper[4703]: I0309 13:47:19.108134 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-reloader/0.log" Mar 09 13:47:19 crc kubenswrapper[4703]: I0309 13:47:19.112166 4703 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-metrics/0.log" Mar 09 13:47:19 crc kubenswrapper[4703]: I0309 13:47:19.112233 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-frr-files/0.log" Mar 09 13:47:19 crc kubenswrapper[4703]: I0309 13:47:19.135343 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-metrics/0.log" Mar 09 13:47:19 crc kubenswrapper[4703]: I0309 13:47:19.306387 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-frr-files/0.log" Mar 09 13:47:19 crc kubenswrapper[4703]: I0309 13:47:19.315321 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-metrics/0.log" Mar 09 13:47:19 crc kubenswrapper[4703]: I0309 13:47:19.317190 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-reloader/0.log" Mar 09 13:47:19 crc kubenswrapper[4703]: I0309 13:47:19.345040 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/controller/0.log" Mar 09 13:47:19 crc kubenswrapper[4703]: I0309 13:47:19.468457 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/kube-rbac-proxy/0.log" Mar 09 13:47:19 crc kubenswrapper[4703]: I0309 13:47:19.490555 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/frr-metrics/0.log" Mar 09 13:47:19 crc kubenswrapper[4703]: I0309 13:47:19.545230 4703 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/kube-rbac-proxy-frr/0.log" Mar 09 13:47:19 crc kubenswrapper[4703]: I0309 13:47:19.630468 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/reloader/0.log" Mar 09 13:47:19 crc kubenswrapper[4703]: I0309 13:47:19.824733 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6c588f5d89-l7mpr_cad59883-2357-4002-a757-689c894f9c33/manager/0.log" Mar 09 13:47:19 crc kubenswrapper[4703]: I0309 13:47:19.934161 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/frr/0.log" Mar 09 13:47:19 crc kubenswrapper[4703]: I0309 13:47:19.956124 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5bcdd89498-pjlcz_f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47/webhook-server/0.log" Mar 09 13:47:20 crc kubenswrapper[4703]: I0309 13:47:20.036260 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qlmqf_a7841ae2-e705-49ff-a7f9-83fc45d05454/kube-rbac-proxy/0.log" Mar 09 13:47:20 crc kubenswrapper[4703]: I0309 13:47:20.235453 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qlmqf_a7841ae2-e705-49ff-a7f9-83fc45d05454/speaker/0.log" Mar 09 13:47:22 crc kubenswrapper[4703]: I0309 13:47:22.707569 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:47:22 crc kubenswrapper[4703]: E0309 13:47:22.707782 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:47:28 crc kubenswrapper[4703]: I0309 13:47:28.336819 4703 scope.go:117] "RemoveContainer" containerID="fa8ff7679c48da18ad72979fd0bd9429bb61cf1f55c33bb431306d56f86b0804" Mar 09 13:47:28 crc kubenswrapper[4703]: I0309 13:47:28.356174 4703 scope.go:117] "RemoveContainer" containerID="6d189dd2bdf061adcbabd7355f537a98c02269eeafc420de3f83bacbf27fd7b9" Mar 09 13:47:28 crc kubenswrapper[4703]: I0309 13:47:28.382356 4703 scope.go:117] "RemoveContainer" containerID="b0ede2a6d6f11ad35470934f54c977630b49b1951fbf2371fbed23fe3d896274" Mar 09 13:47:33 crc kubenswrapper[4703]: I0309 13:47:33.707553 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:47:33 crc kubenswrapper[4703]: E0309 13:47:33.708546 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:47:46 crc kubenswrapper[4703]: I0309 13:47:46.647293 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-78djc_a6cc879a-e440-4851-b7ef-d3894d17bd60/extract-utilities/0.log" Mar 09 13:47:46 crc kubenswrapper[4703]: I0309 13:47:46.707138 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:47:46 crc kubenswrapper[4703]: E0309 13:47:46.707633 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:47:46 crc kubenswrapper[4703]: I0309 13:47:46.818615 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-78djc_a6cc879a-e440-4851-b7ef-d3894d17bd60/extract-content/0.log" Mar 09 13:47:46 crc kubenswrapper[4703]: I0309 13:47:46.842384 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-78djc_a6cc879a-e440-4851-b7ef-d3894d17bd60/extract-utilities/0.log" Mar 09 13:47:46 crc kubenswrapper[4703]: I0309 13:47:46.857493 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-78djc_a6cc879a-e440-4851-b7ef-d3894d17bd60/extract-content/0.log" Mar 09 13:47:46 crc kubenswrapper[4703]: I0309 13:47:46.989306 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-78djc_a6cc879a-e440-4851-b7ef-d3894d17bd60/extract-utilities/0.log" Mar 09 13:47:47 crc kubenswrapper[4703]: I0309 13:47:47.035375 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-78djc_a6cc879a-e440-4851-b7ef-d3894d17bd60/extract-content/0.log" Mar 09 13:47:47 crc kubenswrapper[4703]: I0309 13:47:47.242451 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8rqdr_fbd51f58-4fd4-4d47-a229-9e94e5328d93/extract-utilities/0.log" Mar 09 13:47:47 crc kubenswrapper[4703]: I0309 13:47:47.317898 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-78djc_a6cc879a-e440-4851-b7ef-d3894d17bd60/registry-server/0.log" Mar 09 
13:47:47 crc kubenswrapper[4703]: I0309 13:47:47.342743 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8rqdr_fbd51f58-4fd4-4d47-a229-9e94e5328d93/extract-utilities/0.log" Mar 09 13:47:47 crc kubenswrapper[4703]: I0309 13:47:47.366626 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8rqdr_fbd51f58-4fd4-4d47-a229-9e94e5328d93/extract-content/0.log" Mar 09 13:47:47 crc kubenswrapper[4703]: I0309 13:47:47.419393 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8rqdr_fbd51f58-4fd4-4d47-a229-9e94e5328d93/extract-content/0.log" Mar 09 13:47:47 crc kubenswrapper[4703]: I0309 13:47:47.547833 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8rqdr_fbd51f58-4fd4-4d47-a229-9e94e5328d93/extract-content/0.log" Mar 09 13:47:47 crc kubenswrapper[4703]: I0309 13:47:47.576296 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8rqdr_fbd51f58-4fd4-4d47-a229-9e94e5328d93/extract-utilities/0.log" Mar 09 13:47:47 crc kubenswrapper[4703]: I0309 13:47:47.725336 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m_2f48ca74-d2f2-4baf-a448-e980848ac419/util/0.log" Mar 09 13:47:47 crc kubenswrapper[4703]: I0309 13:47:47.929972 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m_2f48ca74-d2f2-4baf-a448-e980848ac419/util/0.log" Mar 09 13:47:47 crc kubenswrapper[4703]: I0309 13:47:47.930800 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m_2f48ca74-d2f2-4baf-a448-e980848ac419/pull/0.log" Mar 09 13:47:47 crc kubenswrapper[4703]: 
I0309 13:47:47.960662 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8rqdr_fbd51f58-4fd4-4d47-a229-9e94e5328d93/registry-server/0.log" Mar 09 13:47:47 crc kubenswrapper[4703]: I0309 13:47:47.963537 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m_2f48ca74-d2f2-4baf-a448-e980848ac419/pull/0.log" Mar 09 13:47:48 crc kubenswrapper[4703]: I0309 13:47:48.084182 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m_2f48ca74-d2f2-4baf-a448-e980848ac419/util/0.log" Mar 09 13:47:48 crc kubenswrapper[4703]: I0309 13:47:48.094594 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m_2f48ca74-d2f2-4baf-a448-e980848ac419/pull/0.log" Mar 09 13:47:48 crc kubenswrapper[4703]: I0309 13:47:48.104777 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m_2f48ca74-d2f2-4baf-a448-e980848ac419/extract/0.log" Mar 09 13:47:48 crc kubenswrapper[4703]: I0309 13:47:48.248080 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-q7bwg_73560a56-06ae-4ac9-91b9-4fe8478082e7/marketplace-operator/0.log" Mar 09 13:47:48 crc kubenswrapper[4703]: I0309 13:47:48.302475 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fg2b7_8137ddd4-7854-4bd6-b93c-957922a36200/extract-utilities/0.log" Mar 09 13:47:48 crc kubenswrapper[4703]: I0309 13:47:48.421382 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fg2b7_8137ddd4-7854-4bd6-b93c-957922a36200/extract-utilities/0.log" Mar 09 13:47:48 crc 
kubenswrapper[4703]: I0309 13:47:48.425220 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fg2b7_8137ddd4-7854-4bd6-b93c-957922a36200/extract-content/0.log" Mar 09 13:47:48 crc kubenswrapper[4703]: I0309 13:47:48.450230 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fg2b7_8137ddd4-7854-4bd6-b93c-957922a36200/extract-content/0.log" Mar 09 13:47:48 crc kubenswrapper[4703]: I0309 13:47:48.600228 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fg2b7_8137ddd4-7854-4bd6-b93c-957922a36200/extract-content/0.log" Mar 09 13:47:48 crc kubenswrapper[4703]: I0309 13:47:48.623109 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fg2b7_8137ddd4-7854-4bd6-b93c-957922a36200/extract-utilities/0.log" Mar 09 13:47:48 crc kubenswrapper[4703]: I0309 13:47:48.679647 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fg2b7_8137ddd4-7854-4bd6-b93c-957922a36200/registry-server/0.log" Mar 09 13:47:48 crc kubenswrapper[4703]: I0309 13:47:48.804641 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-685h8_9923728b-2d17-45a1-80a3-279456217b0f/extract-utilities/0.log" Mar 09 13:47:48 crc kubenswrapper[4703]: I0309 13:47:48.933882 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-685h8_9923728b-2d17-45a1-80a3-279456217b0f/extract-utilities/0.log" Mar 09 13:47:48 crc kubenswrapper[4703]: I0309 13:47:48.948177 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-685h8_9923728b-2d17-45a1-80a3-279456217b0f/extract-content/0.log" Mar 09 13:47:48 crc kubenswrapper[4703]: I0309 13:47:48.964064 4703 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-685h8_9923728b-2d17-45a1-80a3-279456217b0f/extract-content/0.log" Mar 09 13:47:49 crc kubenswrapper[4703]: I0309 13:47:49.146455 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-685h8_9923728b-2d17-45a1-80a3-279456217b0f/extract-utilities/0.log" Mar 09 13:47:49 crc kubenswrapper[4703]: I0309 13:47:49.165440 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-685h8_9923728b-2d17-45a1-80a3-279456217b0f/extract-content/0.log" Mar 09 13:47:49 crc kubenswrapper[4703]: I0309 13:47:49.423947 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-685h8_9923728b-2d17-45a1-80a3-279456217b0f/registry-server/0.log" Mar 09 13:48:00 crc kubenswrapper[4703]: I0309 13:48:00.162986 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551068-krbtc"] Mar 09 13:48:00 crc kubenswrapper[4703]: E0309 13:48:00.164063 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1" containerName="oc" Mar 09 13:48:00 crc kubenswrapper[4703]: I0309 13:48:00.164086 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1" containerName="oc" Mar 09 13:48:00 crc kubenswrapper[4703]: I0309 13:48:00.164268 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1" containerName="oc" Mar 09 13:48:00 crc kubenswrapper[4703]: I0309 13:48:00.165624 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551068-krbtc" Mar 09 13:48:00 crc kubenswrapper[4703]: I0309 13:48:00.167683 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551068-krbtc"] Mar 09 13:48:00 crc kubenswrapper[4703]: I0309 13:48:00.169027 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:48:00 crc kubenswrapper[4703]: I0309 13:48:00.169335 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:48:00 crc kubenswrapper[4703]: I0309 13:48:00.171110 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rkrjn" Mar 09 13:48:00 crc kubenswrapper[4703]: I0309 13:48:00.299419 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbpl6\" (UniqueName: \"kubernetes.io/projected/51432a9e-2585-4e57-a434-3ceef691ce43-kube-api-access-lbpl6\") pod \"auto-csr-approver-29551068-krbtc\" (UID: \"51432a9e-2585-4e57-a434-3ceef691ce43\") " pod="openshift-infra/auto-csr-approver-29551068-krbtc" Mar 09 13:48:00 crc kubenswrapper[4703]: I0309 13:48:00.401193 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbpl6\" (UniqueName: \"kubernetes.io/projected/51432a9e-2585-4e57-a434-3ceef691ce43-kube-api-access-lbpl6\") pod \"auto-csr-approver-29551068-krbtc\" (UID: \"51432a9e-2585-4e57-a434-3ceef691ce43\") " pod="openshift-infra/auto-csr-approver-29551068-krbtc" Mar 09 13:48:00 crc kubenswrapper[4703]: I0309 13:48:00.439639 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbpl6\" (UniqueName: \"kubernetes.io/projected/51432a9e-2585-4e57-a434-3ceef691ce43-kube-api-access-lbpl6\") pod \"auto-csr-approver-29551068-krbtc\" (UID: \"51432a9e-2585-4e57-a434-3ceef691ce43\") " 
pod="openshift-infra/auto-csr-approver-29551068-krbtc" Mar 09 13:48:00 crc kubenswrapper[4703]: I0309 13:48:00.492631 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551068-krbtc" Mar 09 13:48:00 crc kubenswrapper[4703]: I0309 13:48:00.706894 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:48:00 crc kubenswrapper[4703]: E0309 13:48:00.707642 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:48:00 crc kubenswrapper[4703]: I0309 13:48:00.918127 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551068-krbtc"] Mar 09 13:48:01 crc kubenswrapper[4703]: I0309 13:48:01.743411 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551068-krbtc" event={"ID":"51432a9e-2585-4e57-a434-3ceef691ce43","Type":"ContainerStarted","Data":"182554da231c38f9cea9febef3efa2097b08e0d331ca1666e8709fbd24a32f62"} Mar 09 13:48:02 crc kubenswrapper[4703]: I0309 13:48:02.752180 4703 generic.go:334] "Generic (PLEG): container finished" podID="51432a9e-2585-4e57-a434-3ceef691ce43" containerID="a981b4dc895454936c60a2c0cd57431ed19eb530423f942402ee12ba66efd649" exitCode=0 Mar 09 13:48:02 crc kubenswrapper[4703]: I0309 13:48:02.753262 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551068-krbtc" event={"ID":"51432a9e-2585-4e57-a434-3ceef691ce43","Type":"ContainerDied","Data":"a981b4dc895454936c60a2c0cd57431ed19eb530423f942402ee12ba66efd649"} 
Mar 09 13:48:04 crc kubenswrapper[4703]: I0309 13:48:04.017791 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551068-krbtc" Mar 09 13:48:04 crc kubenswrapper[4703]: I0309 13:48:04.150817 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbpl6\" (UniqueName: \"kubernetes.io/projected/51432a9e-2585-4e57-a434-3ceef691ce43-kube-api-access-lbpl6\") pod \"51432a9e-2585-4e57-a434-3ceef691ce43\" (UID: \"51432a9e-2585-4e57-a434-3ceef691ce43\") " Mar 09 13:48:04 crc kubenswrapper[4703]: I0309 13:48:04.156479 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51432a9e-2585-4e57-a434-3ceef691ce43-kube-api-access-lbpl6" (OuterVolumeSpecName: "kube-api-access-lbpl6") pod "51432a9e-2585-4e57-a434-3ceef691ce43" (UID: "51432a9e-2585-4e57-a434-3ceef691ce43"). InnerVolumeSpecName "kube-api-access-lbpl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:48:04 crc kubenswrapper[4703]: I0309 13:48:04.252539 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbpl6\" (UniqueName: \"kubernetes.io/projected/51432a9e-2585-4e57-a434-3ceef691ce43-kube-api-access-lbpl6\") on node \"crc\" DevicePath \"\"" Mar 09 13:48:04 crc kubenswrapper[4703]: I0309 13:48:04.763827 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551068-krbtc" event={"ID":"51432a9e-2585-4e57-a434-3ceef691ce43","Type":"ContainerDied","Data":"182554da231c38f9cea9febef3efa2097b08e0d331ca1666e8709fbd24a32f62"} Mar 09 13:48:04 crc kubenswrapper[4703]: I0309 13:48:04.763908 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="182554da231c38f9cea9febef3efa2097b08e0d331ca1666e8709fbd24a32f62" Mar 09 13:48:04 crc kubenswrapper[4703]: I0309 13:48:04.764042 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551068-krbtc" Mar 09 13:48:05 crc kubenswrapper[4703]: I0309 13:48:05.098412 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551062-lw8cm"] Mar 09 13:48:05 crc kubenswrapper[4703]: I0309 13:48:05.104506 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551062-lw8cm"] Mar 09 13:48:06 crc kubenswrapper[4703]: I0309 13:48:06.719387 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c507239-c234-4c0f-b792-43e2c592106c" path="/var/lib/kubelet/pods/8c507239-c234-4c0f-b792-43e2c592106c/volumes" Mar 09 13:48:12 crc kubenswrapper[4703]: I0309 13:48:12.714507 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:48:12 crc kubenswrapper[4703]: E0309 13:48:12.715173 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:48:27 crc kubenswrapper[4703]: I0309 13:48:27.706910 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:48:27 crc kubenswrapper[4703]: E0309 13:48:27.707884 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" 
podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:48:28 crc kubenswrapper[4703]: I0309 13:48:28.478324 4703 scope.go:117] "RemoveContainer" containerID="fc153240335b020ba4fe47c249a70fd899fc69a21c7c6187417de0fea757e6b8" Mar 09 13:48:41 crc kubenswrapper[4703]: I0309 13:48:41.707307 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:48:41 crc kubenswrapper[4703]: E0309 13:48:41.710450 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:48:54 crc kubenswrapper[4703]: I0309 13:48:54.711009 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:48:54 crc kubenswrapper[4703]: E0309 13:48:54.713312 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:49:01 crc kubenswrapper[4703]: I0309 13:49:01.249101 4703 generic.go:334] "Generic (PLEG): container finished" podID="98a41f6a-d624-41bc-ac11-cf49a7e96960" containerID="29fc4c9bf4c83cec0520dccea0e52c5a7efc439def903286e64a09031b711e95" exitCode=0 Mar 09 13:49:01 crc kubenswrapper[4703]: I0309 13:49:01.249236 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p66pf/must-gather-hlwpf" 
event={"ID":"98a41f6a-d624-41bc-ac11-cf49a7e96960","Type":"ContainerDied","Data":"29fc4c9bf4c83cec0520dccea0e52c5a7efc439def903286e64a09031b711e95"} Mar 09 13:49:01 crc kubenswrapper[4703]: I0309 13:49:01.252768 4703 scope.go:117] "RemoveContainer" containerID="29fc4c9bf4c83cec0520dccea0e52c5a7efc439def903286e64a09031b711e95" Mar 09 13:49:01 crc kubenswrapper[4703]: I0309 13:49:01.357043 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p66pf_must-gather-hlwpf_98a41f6a-d624-41bc-ac11-cf49a7e96960/gather/0.log" Mar 09 13:49:05 crc kubenswrapper[4703]: I0309 13:49:05.706956 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:49:05 crc kubenswrapper[4703]: E0309 13:49:05.707916 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:49:08 crc kubenswrapper[4703]: I0309 13:49:08.378149 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-p66pf/must-gather-hlwpf"] Mar 09 13:49:08 crc kubenswrapper[4703]: I0309 13:49:08.378968 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-p66pf/must-gather-hlwpf" podUID="98a41f6a-d624-41bc-ac11-cf49a7e96960" containerName="copy" containerID="cri-o://7db5ccdbe2c1587ac5c5eef8e19a4c15aab42968b31a816462c11158ea73df08" gracePeriod=2 Mar 09 13:49:08 crc kubenswrapper[4703]: I0309 13:49:08.383341 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-p66pf/must-gather-hlwpf"] Mar 09 13:49:08 crc kubenswrapper[4703]: I0309 13:49:08.735435 4703 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p66pf_must-gather-hlwpf_98a41f6a-d624-41bc-ac11-cf49a7e96960/copy/0.log" Mar 09 13:49:08 crc kubenswrapper[4703]: I0309 13:49:08.736764 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p66pf/must-gather-hlwpf" Mar 09 13:49:08 crc kubenswrapper[4703]: I0309 13:49:08.930831 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2z9s\" (UniqueName: \"kubernetes.io/projected/98a41f6a-d624-41bc-ac11-cf49a7e96960-kube-api-access-z2z9s\") pod \"98a41f6a-d624-41bc-ac11-cf49a7e96960\" (UID: \"98a41f6a-d624-41bc-ac11-cf49a7e96960\") " Mar 09 13:49:08 crc kubenswrapper[4703]: I0309 13:49:08.930984 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98a41f6a-d624-41bc-ac11-cf49a7e96960-must-gather-output\") pod \"98a41f6a-d624-41bc-ac11-cf49a7e96960\" (UID: \"98a41f6a-d624-41bc-ac11-cf49a7e96960\") " Mar 09 13:49:08 crc kubenswrapper[4703]: I0309 13:49:08.948037 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98a41f6a-d624-41bc-ac11-cf49a7e96960-kube-api-access-z2z9s" (OuterVolumeSpecName: "kube-api-access-z2z9s") pod "98a41f6a-d624-41bc-ac11-cf49a7e96960" (UID: "98a41f6a-d624-41bc-ac11-cf49a7e96960"). InnerVolumeSpecName "kube-api-access-z2z9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:49:09 crc kubenswrapper[4703]: I0309 13:49:09.001437 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98a41f6a-d624-41bc-ac11-cf49a7e96960-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "98a41f6a-d624-41bc-ac11-cf49a7e96960" (UID: "98a41f6a-d624-41bc-ac11-cf49a7e96960"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:49:09 crc kubenswrapper[4703]: I0309 13:49:09.032232 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2z9s\" (UniqueName: \"kubernetes.io/projected/98a41f6a-d624-41bc-ac11-cf49a7e96960-kube-api-access-z2z9s\") on node \"crc\" DevicePath \"\"" Mar 09 13:49:09 crc kubenswrapper[4703]: I0309 13:49:09.032261 4703 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98a41f6a-d624-41bc-ac11-cf49a7e96960-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 09 13:49:09 crc kubenswrapper[4703]: I0309 13:49:09.315876 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p66pf_must-gather-hlwpf_98a41f6a-d624-41bc-ac11-cf49a7e96960/copy/0.log" Mar 09 13:49:09 crc kubenswrapper[4703]: I0309 13:49:09.316385 4703 generic.go:334] "Generic (PLEG): container finished" podID="98a41f6a-d624-41bc-ac11-cf49a7e96960" containerID="7db5ccdbe2c1587ac5c5eef8e19a4c15aab42968b31a816462c11158ea73df08" exitCode=143 Mar 09 13:49:09 crc kubenswrapper[4703]: I0309 13:49:09.316474 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p66pf/must-gather-hlwpf" Mar 09 13:49:09 crc kubenswrapper[4703]: I0309 13:49:09.316505 4703 scope.go:117] "RemoveContainer" containerID="7db5ccdbe2c1587ac5c5eef8e19a4c15aab42968b31a816462c11158ea73df08" Mar 09 13:49:09 crc kubenswrapper[4703]: I0309 13:49:09.341504 4703 scope.go:117] "RemoveContainer" containerID="29fc4c9bf4c83cec0520dccea0e52c5a7efc439def903286e64a09031b711e95" Mar 09 13:49:09 crc kubenswrapper[4703]: I0309 13:49:09.389176 4703 scope.go:117] "RemoveContainer" containerID="7db5ccdbe2c1587ac5c5eef8e19a4c15aab42968b31a816462c11158ea73df08" Mar 09 13:49:09 crc kubenswrapper[4703]: E0309 13:49:09.389713 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7db5ccdbe2c1587ac5c5eef8e19a4c15aab42968b31a816462c11158ea73df08\": container with ID starting with 7db5ccdbe2c1587ac5c5eef8e19a4c15aab42968b31a816462c11158ea73df08 not found: ID does not exist" containerID="7db5ccdbe2c1587ac5c5eef8e19a4c15aab42968b31a816462c11158ea73df08" Mar 09 13:49:09 crc kubenswrapper[4703]: I0309 13:49:09.389752 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7db5ccdbe2c1587ac5c5eef8e19a4c15aab42968b31a816462c11158ea73df08"} err="failed to get container status \"7db5ccdbe2c1587ac5c5eef8e19a4c15aab42968b31a816462c11158ea73df08\": rpc error: code = NotFound desc = could not find container \"7db5ccdbe2c1587ac5c5eef8e19a4c15aab42968b31a816462c11158ea73df08\": container with ID starting with 7db5ccdbe2c1587ac5c5eef8e19a4c15aab42968b31a816462c11158ea73df08 not found: ID does not exist" Mar 09 13:49:09 crc kubenswrapper[4703]: I0309 13:49:09.389784 4703 scope.go:117] "RemoveContainer" containerID="29fc4c9bf4c83cec0520dccea0e52c5a7efc439def903286e64a09031b711e95" Mar 09 13:49:09 crc kubenswrapper[4703]: E0309 13:49:09.390952 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"29fc4c9bf4c83cec0520dccea0e52c5a7efc439def903286e64a09031b711e95\": container with ID starting with 29fc4c9bf4c83cec0520dccea0e52c5a7efc439def903286e64a09031b711e95 not found: ID does not exist" containerID="29fc4c9bf4c83cec0520dccea0e52c5a7efc439def903286e64a09031b711e95" Mar 09 13:49:09 crc kubenswrapper[4703]: I0309 13:49:09.390976 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29fc4c9bf4c83cec0520dccea0e52c5a7efc439def903286e64a09031b711e95"} err="failed to get container status \"29fc4c9bf4c83cec0520dccea0e52c5a7efc439def903286e64a09031b711e95\": rpc error: code = NotFound desc = could not find container \"29fc4c9bf4c83cec0520dccea0e52c5a7efc439def903286e64a09031b711e95\": container with ID starting with 29fc4c9bf4c83cec0520dccea0e52c5a7efc439def903286e64a09031b711e95 not found: ID does not exist" Mar 09 13:49:10 crc kubenswrapper[4703]: I0309 13:49:10.717665 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98a41f6a-d624-41bc-ac11-cf49a7e96960" path="/var/lib/kubelet/pods/98a41f6a-d624-41bc-ac11-cf49a7e96960/volumes" Mar 09 13:49:16 crc kubenswrapper[4703]: I0309 13:49:16.707702 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:49:16 crc kubenswrapper[4703]: E0309 13:49:16.708211 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:49:27 crc kubenswrapper[4703]: I0309 13:49:27.707212 4703 scope.go:117] "RemoveContainer" 
containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:49:27 crc kubenswrapper[4703]: E0309 13:49:27.707922 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:49:28 crc kubenswrapper[4703]: I0309 13:49:28.553155 4703 scope.go:117] "RemoveContainer" containerID="73c722e627e04c8631389202420e69e40ce36adf8ecc45a47499418b2d2b6639" Mar 09 13:49:28 crc kubenswrapper[4703]: I0309 13:49:28.585100 4703 scope.go:117] "RemoveContainer" containerID="ea8c3282c4ddfd90584cc8d00c829b2761551af5a7aacdd845aa84e97f3514ee" Mar 09 13:49:28 crc kubenswrapper[4703]: I0309 13:49:28.617415 4703 scope.go:117] "RemoveContainer" containerID="126677a845daf3c303e8c68d80efda701d45d16d2ebb59c8fc6c289fcfa45845" Mar 09 13:49:28 crc kubenswrapper[4703]: I0309 13:49:28.631434 4703 scope.go:117] "RemoveContainer" containerID="b66fea10bd1ffe1d6b60c1efc601c72dc5945d496166a9d70907cfc93cddabb3" Mar 09 13:49:42 crc kubenswrapper[4703]: I0309 13:49:42.707680 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:49:42 crc kubenswrapper[4703]: E0309 13:49:42.708743 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:49:53 crc 
kubenswrapper[4703]: I0309 13:49:53.708470 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:49:53 crc kubenswrapper[4703]: E0309 13:49:53.709635 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:50:00 crc kubenswrapper[4703]: I0309 13:50:00.145502 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551070-f22kz"] Mar 09 13:50:00 crc kubenswrapper[4703]: E0309 13:50:00.146451 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98a41f6a-d624-41bc-ac11-cf49a7e96960" containerName="copy" Mar 09 13:50:00 crc kubenswrapper[4703]: I0309 13:50:00.146500 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="98a41f6a-d624-41bc-ac11-cf49a7e96960" containerName="copy" Mar 09 13:50:00 crc kubenswrapper[4703]: E0309 13:50:00.146522 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51432a9e-2585-4e57-a434-3ceef691ce43" containerName="oc" Mar 09 13:50:00 crc kubenswrapper[4703]: I0309 13:50:00.146533 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="51432a9e-2585-4e57-a434-3ceef691ce43" containerName="oc" Mar 09 13:50:00 crc kubenswrapper[4703]: E0309 13:50:00.146589 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98a41f6a-d624-41bc-ac11-cf49a7e96960" containerName="gather" Mar 09 13:50:00 crc kubenswrapper[4703]: I0309 13:50:00.146602 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="98a41f6a-d624-41bc-ac11-cf49a7e96960" containerName="gather" Mar 09 13:50:00 crc kubenswrapper[4703]: I0309 
13:50:00.148396 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="98a41f6a-d624-41bc-ac11-cf49a7e96960" containerName="copy" Mar 09 13:50:00 crc kubenswrapper[4703]: I0309 13:50:00.148423 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="98a41f6a-d624-41bc-ac11-cf49a7e96960" containerName="gather" Mar 09 13:50:00 crc kubenswrapper[4703]: I0309 13:50:00.148480 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="51432a9e-2585-4e57-a434-3ceef691ce43" containerName="oc" Mar 09 13:50:00 crc kubenswrapper[4703]: I0309 13:50:00.149661 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551070-f22kz" Mar 09 13:50:00 crc kubenswrapper[4703]: I0309 13:50:00.155253 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551070-f22kz"] Mar 09 13:50:00 crc kubenswrapper[4703]: I0309 13:50:00.157074 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rkrjn" Mar 09 13:50:00 crc kubenswrapper[4703]: I0309 13:50:00.157422 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:50:00 crc kubenswrapper[4703]: I0309 13:50:00.157639 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:50:00 crc kubenswrapper[4703]: I0309 13:50:00.289266 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p6bf\" (UniqueName: \"kubernetes.io/projected/ffa8e1b1-3b52-427a-ba01-9741dc577c9b-kube-api-access-5p6bf\") pod \"auto-csr-approver-29551070-f22kz\" (UID: \"ffa8e1b1-3b52-427a-ba01-9741dc577c9b\") " pod="openshift-infra/auto-csr-approver-29551070-f22kz" Mar 09 13:50:00 crc kubenswrapper[4703]: I0309 13:50:00.390232 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-5p6bf\" (UniqueName: \"kubernetes.io/projected/ffa8e1b1-3b52-427a-ba01-9741dc577c9b-kube-api-access-5p6bf\") pod \"auto-csr-approver-29551070-f22kz\" (UID: \"ffa8e1b1-3b52-427a-ba01-9741dc577c9b\") " pod="openshift-infra/auto-csr-approver-29551070-f22kz" Mar 09 13:50:00 crc kubenswrapper[4703]: I0309 13:50:00.409738 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p6bf\" (UniqueName: \"kubernetes.io/projected/ffa8e1b1-3b52-427a-ba01-9741dc577c9b-kube-api-access-5p6bf\") pod \"auto-csr-approver-29551070-f22kz\" (UID: \"ffa8e1b1-3b52-427a-ba01-9741dc577c9b\") " pod="openshift-infra/auto-csr-approver-29551070-f22kz" Mar 09 13:50:00 crc kubenswrapper[4703]: I0309 13:50:00.472737 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551070-f22kz" Mar 09 13:50:00 crc kubenswrapper[4703]: I0309 13:50:00.909928 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551070-f22kz"] Mar 09 13:50:01 crc kubenswrapper[4703]: I0309 13:50:01.724400 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551070-f22kz" event={"ID":"ffa8e1b1-3b52-427a-ba01-9741dc577c9b","Type":"ContainerStarted","Data":"36fad5b233c8b518bce3801054cf4cdec0ea1ed4f8a8d366c29a416e085ba4c0"} Mar 09 13:50:02 crc kubenswrapper[4703]: I0309 13:50:02.735715 4703 generic.go:334] "Generic (PLEG): container finished" podID="ffa8e1b1-3b52-427a-ba01-9741dc577c9b" containerID="f4ec704096dd91dd49b61e9a9706ffcf474614bd7edfe4c98dd21472f5357370" exitCode=0 Mar 09 13:50:02 crc kubenswrapper[4703]: I0309 13:50:02.735763 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551070-f22kz" event={"ID":"ffa8e1b1-3b52-427a-ba01-9741dc577c9b","Type":"ContainerDied","Data":"f4ec704096dd91dd49b61e9a9706ffcf474614bd7edfe4c98dd21472f5357370"} Mar 09 13:50:04 crc 
kubenswrapper[4703]: I0309 13:50:04.089764 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551070-f22kz" Mar 09 13:50:04 crc kubenswrapper[4703]: I0309 13:50:04.251590 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p6bf\" (UniqueName: \"kubernetes.io/projected/ffa8e1b1-3b52-427a-ba01-9741dc577c9b-kube-api-access-5p6bf\") pod \"ffa8e1b1-3b52-427a-ba01-9741dc577c9b\" (UID: \"ffa8e1b1-3b52-427a-ba01-9741dc577c9b\") " Mar 09 13:50:04 crc kubenswrapper[4703]: I0309 13:50:04.259911 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffa8e1b1-3b52-427a-ba01-9741dc577c9b-kube-api-access-5p6bf" (OuterVolumeSpecName: "kube-api-access-5p6bf") pod "ffa8e1b1-3b52-427a-ba01-9741dc577c9b" (UID: "ffa8e1b1-3b52-427a-ba01-9741dc577c9b"). InnerVolumeSpecName "kube-api-access-5p6bf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:50:04 crc kubenswrapper[4703]: I0309 13:50:04.353004 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p6bf\" (UniqueName: \"kubernetes.io/projected/ffa8e1b1-3b52-427a-ba01-9741dc577c9b-kube-api-access-5p6bf\") on node \"crc\" DevicePath \"\"" Mar 09 13:50:04 crc kubenswrapper[4703]: I0309 13:50:04.757779 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551070-f22kz" event={"ID":"ffa8e1b1-3b52-427a-ba01-9741dc577c9b","Type":"ContainerDied","Data":"36fad5b233c8b518bce3801054cf4cdec0ea1ed4f8a8d366c29a416e085ba4c0"} Mar 09 13:50:04 crc kubenswrapper[4703]: I0309 13:50:04.758156 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36fad5b233c8b518bce3801054cf4cdec0ea1ed4f8a8d366c29a416e085ba4c0" Mar 09 13:50:04 crc kubenswrapper[4703]: I0309 13:50:04.757829 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551070-f22kz" Mar 09 13:50:05 crc kubenswrapper[4703]: I0309 13:50:05.170807 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551064-q2lc8"] Mar 09 13:50:05 crc kubenswrapper[4703]: I0309 13:50:05.178977 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551064-q2lc8"] Mar 09 13:50:05 crc kubenswrapper[4703]: I0309 13:50:05.706989 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:50:05 crc kubenswrapper[4703]: E0309 13:50:05.707224 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:50:06 crc kubenswrapper[4703]: I0309 13:50:06.715426 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97219a68-6ef6-4f8e-947d-da434f9931c4" path="/var/lib/kubelet/pods/97219a68-6ef6-4f8e-947d-da434f9931c4/volumes" Mar 09 13:50:18 crc kubenswrapper[4703]: I0309 13:50:18.708426 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:50:18 crc kubenswrapper[4703]: E0309 13:50:18.709975 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" 
podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:50:28 crc kubenswrapper[4703]: I0309 13:50:28.722466 4703 scope.go:117] "RemoveContainer" containerID="ef95328eab0a37adaa01af3ed1ddc38fa9933e9da2321bf2870f73e720f14640" Mar 09 13:50:28 crc kubenswrapper[4703]: I0309 13:50:28.769735 4703 scope.go:117] "RemoveContainer" containerID="42dc11fd013e2e4632457a5aca0780e3a954bfa4626018fcdf35f7c3fc4eb0d9" Mar 09 13:50:28 crc kubenswrapper[4703]: I0309 13:50:28.796438 4703 scope.go:117] "RemoveContainer" containerID="8b0e34fef625e494409c110885c752bcc083276cf09c6658313db16b5966e6c3" Mar 09 13:50:28 crc kubenswrapper[4703]: I0309 13:50:28.850386 4703 scope.go:117] "RemoveContainer" containerID="56a38f020f216e904effa92db314b3cb02413f88b655bf72930151beddc3a5d7" Mar 09 13:50:28 crc kubenswrapper[4703]: I0309 13:50:28.876229 4703 scope.go:117] "RemoveContainer" containerID="015172b03a8c90128f9abc845c3bb6f95d4504d5db161255f1cf3fae989f39ba" Mar 09 13:50:29 crc kubenswrapper[4703]: I0309 13:50:29.707446 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:50:29 crc kubenswrapper[4703]: E0309 13:50:29.707776 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.602452 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n8kh7"] Mar 09 13:50:38 crc kubenswrapper[4703]: E0309 13:50:38.603829 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa8e1b1-3b52-427a-ba01-9741dc577c9b" containerName="oc" Mar 09 13:50:38 crc 
kubenswrapper[4703]: I0309 13:50:38.603858 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa8e1b1-3b52-427a-ba01-9741dc577c9b" containerName="oc" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.604212 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa8e1b1-3b52-427a-ba01-9741dc577c9b" containerName="oc" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.605795 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n8kh7" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.614056 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n8kh7"] Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.621325 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67fb39e9-eb20-4e4a-85b2-d7f156982112-catalog-content\") pod \"redhat-marketplace-n8kh7\" (UID: \"67fb39e9-eb20-4e4a-85b2-d7f156982112\") " pod="openshift-marketplace/redhat-marketplace-n8kh7" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.621392 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67fb39e9-eb20-4e4a-85b2-d7f156982112-utilities\") pod \"redhat-marketplace-n8kh7\" (UID: \"67fb39e9-eb20-4e4a-85b2-d7f156982112\") " pod="openshift-marketplace/redhat-marketplace-n8kh7" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.621592 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b4px\" (UniqueName: \"kubernetes.io/projected/67fb39e9-eb20-4e4a-85b2-d7f156982112-kube-api-access-4b4px\") pod \"redhat-marketplace-n8kh7\" (UID: \"67fb39e9-eb20-4e4a-85b2-d7f156982112\") " pod="openshift-marketplace/redhat-marketplace-n8kh7" Mar 09 13:50:38 crc kubenswrapper[4703]: 
I0309 13:50:38.723414 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b4px\" (UniqueName: \"kubernetes.io/projected/67fb39e9-eb20-4e4a-85b2-d7f156982112-kube-api-access-4b4px\") pod \"redhat-marketplace-n8kh7\" (UID: \"67fb39e9-eb20-4e4a-85b2-d7f156982112\") " pod="openshift-marketplace/redhat-marketplace-n8kh7" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.724635 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67fb39e9-eb20-4e4a-85b2-d7f156982112-catalog-content\") pod \"redhat-marketplace-n8kh7\" (UID: \"67fb39e9-eb20-4e4a-85b2-d7f156982112\") " pod="openshift-marketplace/redhat-marketplace-n8kh7" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.724780 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67fb39e9-eb20-4e4a-85b2-d7f156982112-utilities\") pod \"redhat-marketplace-n8kh7\" (UID: \"67fb39e9-eb20-4e4a-85b2-d7f156982112\") " pod="openshift-marketplace/redhat-marketplace-n8kh7" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.725423 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67fb39e9-eb20-4e4a-85b2-d7f156982112-utilities\") pod \"redhat-marketplace-n8kh7\" (UID: \"67fb39e9-eb20-4e4a-85b2-d7f156982112\") " pod="openshift-marketplace/redhat-marketplace-n8kh7" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.725769 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67fb39e9-eb20-4e4a-85b2-d7f156982112-catalog-content\") pod \"redhat-marketplace-n8kh7\" (UID: \"67fb39e9-eb20-4e4a-85b2-d7f156982112\") " pod="openshift-marketplace/redhat-marketplace-n8kh7" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.770010 4703 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b4px\" (UniqueName: \"kubernetes.io/projected/67fb39e9-eb20-4e4a-85b2-d7f156982112-kube-api-access-4b4px\") pod \"redhat-marketplace-n8kh7\" (UID: \"67fb39e9-eb20-4e4a-85b2-d7f156982112\") " pod="openshift-marketplace/redhat-marketplace-n8kh7" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.775040 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dq7cf"] Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.776281 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dq7cf" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.785090 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dq7cf"] Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.826343 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ed0f0e-0307-415e-874c-16766d38b041-utilities\") pod \"community-operators-dq7cf\" (UID: \"a8ed0f0e-0307-415e-874c-16766d38b041\") " pod="openshift-marketplace/community-operators-dq7cf" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.826430 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ed0f0e-0307-415e-874c-16766d38b041-catalog-content\") pod \"community-operators-dq7cf\" (UID: \"a8ed0f0e-0307-415e-874c-16766d38b041\") " pod="openshift-marketplace/community-operators-dq7cf" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.826474 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbz8d\" (UniqueName: \"kubernetes.io/projected/a8ed0f0e-0307-415e-874c-16766d38b041-kube-api-access-kbz8d\") pod 
\"community-operators-dq7cf\" (UID: \"a8ed0f0e-0307-415e-874c-16766d38b041\") " pod="openshift-marketplace/community-operators-dq7cf" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.927846 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ed0f0e-0307-415e-874c-16766d38b041-utilities\") pod \"community-operators-dq7cf\" (UID: \"a8ed0f0e-0307-415e-874c-16766d38b041\") " pod="openshift-marketplace/community-operators-dq7cf" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.927992 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ed0f0e-0307-415e-874c-16766d38b041-catalog-content\") pod \"community-operators-dq7cf\" (UID: \"a8ed0f0e-0307-415e-874c-16766d38b041\") " pod="openshift-marketplace/community-operators-dq7cf" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.928055 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbz8d\" (UniqueName: \"kubernetes.io/projected/a8ed0f0e-0307-415e-874c-16766d38b041-kube-api-access-kbz8d\") pod \"community-operators-dq7cf\" (UID: \"a8ed0f0e-0307-415e-874c-16766d38b041\") " pod="openshift-marketplace/community-operators-dq7cf" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.928684 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ed0f0e-0307-415e-874c-16766d38b041-utilities\") pod \"community-operators-dq7cf\" (UID: \"a8ed0f0e-0307-415e-874c-16766d38b041\") " pod="openshift-marketplace/community-operators-dq7cf" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.928884 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ed0f0e-0307-415e-874c-16766d38b041-catalog-content\") pod \"community-operators-dq7cf\" (UID: 
\"a8ed0f0e-0307-415e-874c-16766d38b041\") " pod="openshift-marketplace/community-operators-dq7cf" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.936743 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n8kh7" Mar 09 13:50:38 crc kubenswrapper[4703]: I0309 13:50:38.974316 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbz8d\" (UniqueName: \"kubernetes.io/projected/a8ed0f0e-0307-415e-874c-16766d38b041-kube-api-access-kbz8d\") pod \"community-operators-dq7cf\" (UID: \"a8ed0f0e-0307-415e-874c-16766d38b041\") " pod="openshift-marketplace/community-operators-dq7cf" Mar 09 13:50:39 crc kubenswrapper[4703]: I0309 13:50:39.124710 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dq7cf" Mar 09 13:50:39 crc kubenswrapper[4703]: I0309 13:50:39.192006 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n8kh7"] Mar 09 13:50:39 crc kubenswrapper[4703]: I0309 13:50:39.394527 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dq7cf"] Mar 09 13:50:39 crc kubenswrapper[4703]: W0309 13:50:39.398303 4703 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8ed0f0e_0307_415e_874c_16766d38b041.slice/crio-6d64503d9155a3b5b24c703d9f473d9581447a126af4171bb575072862cef5b8 WatchSource:0}: Error finding container 6d64503d9155a3b5b24c703d9f473d9581447a126af4171bb575072862cef5b8: Status 404 returned error can't find the container with id 6d64503d9155a3b5b24c703d9f473d9581447a126af4171bb575072862cef5b8 Mar 09 13:50:40 crc kubenswrapper[4703]: I0309 13:50:40.034119 4703 generic.go:334] "Generic (PLEG): container finished" podID="a8ed0f0e-0307-415e-874c-16766d38b041" 
containerID="71574ead46ca428b49e3086ae8ca49654e161729cf6e7320819de15bfb241c09" exitCode=0 Mar 09 13:50:40 crc kubenswrapper[4703]: I0309 13:50:40.034184 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dq7cf" event={"ID":"a8ed0f0e-0307-415e-874c-16766d38b041","Type":"ContainerDied","Data":"71574ead46ca428b49e3086ae8ca49654e161729cf6e7320819de15bfb241c09"} Mar 09 13:50:40 crc kubenswrapper[4703]: I0309 13:50:40.034208 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dq7cf" event={"ID":"a8ed0f0e-0307-415e-874c-16766d38b041","Type":"ContainerStarted","Data":"6d64503d9155a3b5b24c703d9f473d9581447a126af4171bb575072862cef5b8"} Mar 09 13:50:40 crc kubenswrapper[4703]: I0309 13:50:40.036411 4703 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:50:40 crc kubenswrapper[4703]: I0309 13:50:40.036889 4703 generic.go:334] "Generic (PLEG): container finished" podID="67fb39e9-eb20-4e4a-85b2-d7f156982112" containerID="9f36a2002e033f680a29494b648564abd6c6aeef63f87a66988c66eaabfb05b6" exitCode=0 Mar 09 13:50:40 crc kubenswrapper[4703]: I0309 13:50:40.036914 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n8kh7" event={"ID":"67fb39e9-eb20-4e4a-85b2-d7f156982112","Type":"ContainerDied","Data":"9f36a2002e033f680a29494b648564abd6c6aeef63f87a66988c66eaabfb05b6"} Mar 09 13:50:40 crc kubenswrapper[4703]: I0309 13:50:40.036930 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n8kh7" event={"ID":"67fb39e9-eb20-4e4a-85b2-d7f156982112","Type":"ContainerStarted","Data":"563e12ba575172abd56c37d400f1852bf63c5ba50c4002b03f94d04212aff7aa"} Mar 09 13:50:41 crc kubenswrapper[4703]: I0309 13:50:41.046496 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dq7cf" 
event={"ID":"a8ed0f0e-0307-415e-874c-16766d38b041","Type":"ContainerStarted","Data":"725e83f2d792533f15e45b3ad9c2be2ea685af6820818b3d8334865b1acf72a7"} Mar 09 13:50:41 crc kubenswrapper[4703]: I0309 13:50:41.054840 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n8kh7" event={"ID":"67fb39e9-eb20-4e4a-85b2-d7f156982112","Type":"ContainerStarted","Data":"7c0e0313e43084afd09947048544b440155cf127dc62063cb6877cf49ae83522"} Mar 09 13:50:42 crc kubenswrapper[4703]: I0309 13:50:42.067385 4703 generic.go:334] "Generic (PLEG): container finished" podID="a8ed0f0e-0307-415e-874c-16766d38b041" containerID="725e83f2d792533f15e45b3ad9c2be2ea685af6820818b3d8334865b1acf72a7" exitCode=0 Mar 09 13:50:42 crc kubenswrapper[4703]: I0309 13:50:42.067661 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dq7cf" event={"ID":"a8ed0f0e-0307-415e-874c-16766d38b041","Type":"ContainerDied","Data":"725e83f2d792533f15e45b3ad9c2be2ea685af6820818b3d8334865b1acf72a7"} Mar 09 13:50:42 crc kubenswrapper[4703]: I0309 13:50:42.073739 4703 generic.go:334] "Generic (PLEG): container finished" podID="67fb39e9-eb20-4e4a-85b2-d7f156982112" containerID="7c0e0313e43084afd09947048544b440155cf127dc62063cb6877cf49ae83522" exitCode=0 Mar 09 13:50:42 crc kubenswrapper[4703]: I0309 13:50:42.073793 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n8kh7" event={"ID":"67fb39e9-eb20-4e4a-85b2-d7f156982112","Type":"ContainerDied","Data":"7c0e0313e43084afd09947048544b440155cf127dc62063cb6877cf49ae83522"} Mar 09 13:50:42 crc kubenswrapper[4703]: I0309 13:50:42.707417 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:50:42 crc kubenswrapper[4703]: E0309 13:50:42.708132 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:50:43 crc kubenswrapper[4703]: I0309 13:50:43.088213 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dq7cf" event={"ID":"a8ed0f0e-0307-415e-874c-16766d38b041","Type":"ContainerStarted","Data":"e88fae627ca3058af095b4c56289aafd50d6e10760077df149239e1d326704aa"} Mar 09 13:50:43 crc kubenswrapper[4703]: I0309 13:50:43.093968 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n8kh7" event={"ID":"67fb39e9-eb20-4e4a-85b2-d7f156982112","Type":"ContainerStarted","Data":"6d11f5ef8a0ac3d5a80f1e8dac311415413a9cbc9d7743a550af20fac1750371"} Mar 09 13:50:43 crc kubenswrapper[4703]: I0309 13:50:43.123318 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dq7cf" podStartSLOduration=2.657257686 podStartE2EDuration="5.123286855s" podCreationTimestamp="2026-03-09 13:50:38 +0000 UTC" firstStartedPulling="2026-03-09 13:50:40.036210991 +0000 UTC m=+1836.003626677" lastFinishedPulling="2026-03-09 13:50:42.50224013 +0000 UTC m=+1838.469655846" observedRunningTime="2026-03-09 13:50:43.110222071 +0000 UTC m=+1839.077637817" watchObservedRunningTime="2026-03-09 13:50:43.123286855 +0000 UTC m=+1839.090702591" Mar 09 13:50:43 crc kubenswrapper[4703]: I0309 13:50:43.130901 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n8kh7" podStartSLOduration=2.282956163 podStartE2EDuration="5.130863752s" podCreationTimestamp="2026-03-09 13:50:38 +0000 UTC" firstStartedPulling="2026-03-09 13:50:40.038421964 +0000 UTC m=+1836.005837680" 
lastFinishedPulling="2026-03-09 13:50:42.886329583 +0000 UTC m=+1838.853745269" observedRunningTime="2026-03-09 13:50:43.12871449 +0000 UTC m=+1839.096130186" watchObservedRunningTime="2026-03-09 13:50:43.130863752 +0000 UTC m=+1839.098279448" Mar 09 13:50:48 crc kubenswrapper[4703]: I0309 13:50:48.937457 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n8kh7" Mar 09 13:50:48 crc kubenswrapper[4703]: I0309 13:50:48.938224 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n8kh7" Mar 09 13:50:49 crc kubenswrapper[4703]: I0309 13:50:49.003896 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n8kh7" Mar 09 13:50:49 crc kubenswrapper[4703]: I0309 13:50:49.126350 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dq7cf" Mar 09 13:50:49 crc kubenswrapper[4703]: I0309 13:50:49.126388 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dq7cf" Mar 09 13:50:49 crc kubenswrapper[4703]: I0309 13:50:49.193591 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dq7cf" Mar 09 13:50:49 crc kubenswrapper[4703]: I0309 13:50:49.203717 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n8kh7" Mar 09 13:50:50 crc kubenswrapper[4703]: I0309 13:50:50.202338 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dq7cf" Mar 09 13:50:50 crc kubenswrapper[4703]: I0309 13:50:50.253071 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n8kh7"] Mar 09 13:50:51 crc kubenswrapper[4703]: I0309 13:50:51.143971 
4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n8kh7" podUID="67fb39e9-eb20-4e4a-85b2-d7f156982112" containerName="registry-server" containerID="cri-o://6d11f5ef8a0ac3d5a80f1e8dac311415413a9cbc9d7743a550af20fac1750371" gracePeriod=2 Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.054314 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n8kh7" Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.140448 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67fb39e9-eb20-4e4a-85b2-d7f156982112-utilities\") pod \"67fb39e9-eb20-4e4a-85b2-d7f156982112\" (UID: \"67fb39e9-eb20-4e4a-85b2-d7f156982112\") " Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.140541 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67fb39e9-eb20-4e4a-85b2-d7f156982112-catalog-content\") pod \"67fb39e9-eb20-4e4a-85b2-d7f156982112\" (UID: \"67fb39e9-eb20-4e4a-85b2-d7f156982112\") " Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.140619 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b4px\" (UniqueName: \"kubernetes.io/projected/67fb39e9-eb20-4e4a-85b2-d7f156982112-kube-api-access-4b4px\") pod \"67fb39e9-eb20-4e4a-85b2-d7f156982112\" (UID: \"67fb39e9-eb20-4e4a-85b2-d7f156982112\") " Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.142483 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67fb39e9-eb20-4e4a-85b2-d7f156982112-utilities" (OuterVolumeSpecName: "utilities") pod "67fb39e9-eb20-4e4a-85b2-d7f156982112" (UID: "67fb39e9-eb20-4e4a-85b2-d7f156982112"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.145983 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67fb39e9-eb20-4e4a-85b2-d7f156982112-kube-api-access-4b4px" (OuterVolumeSpecName: "kube-api-access-4b4px") pod "67fb39e9-eb20-4e4a-85b2-d7f156982112" (UID: "67fb39e9-eb20-4e4a-85b2-d7f156982112"). InnerVolumeSpecName "kube-api-access-4b4px". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.152026 4703 generic.go:334] "Generic (PLEG): container finished" podID="67fb39e9-eb20-4e4a-85b2-d7f156982112" containerID="6d11f5ef8a0ac3d5a80f1e8dac311415413a9cbc9d7743a550af20fac1750371" exitCode=0 Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.152072 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n8kh7" event={"ID":"67fb39e9-eb20-4e4a-85b2-d7f156982112","Type":"ContainerDied","Data":"6d11f5ef8a0ac3d5a80f1e8dac311415413a9cbc9d7743a550af20fac1750371"} Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.152101 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n8kh7" event={"ID":"67fb39e9-eb20-4e4a-85b2-d7f156982112","Type":"ContainerDied","Data":"563e12ba575172abd56c37d400f1852bf63c5ba50c4002b03f94d04212aff7aa"} Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.152123 4703 scope.go:117] "RemoveContainer" containerID="6d11f5ef8a0ac3d5a80f1e8dac311415413a9cbc9d7743a550af20fac1750371" Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.152251 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n8kh7" Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.167290 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67fb39e9-eb20-4e4a-85b2-d7f156982112-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67fb39e9-eb20-4e4a-85b2-d7f156982112" (UID: "67fb39e9-eb20-4e4a-85b2-d7f156982112"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.172402 4703 scope.go:117] "RemoveContainer" containerID="7c0e0313e43084afd09947048544b440155cf127dc62063cb6877cf49ae83522" Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.189772 4703 scope.go:117] "RemoveContainer" containerID="9f36a2002e033f680a29494b648564abd6c6aeef63f87a66988c66eaabfb05b6" Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.203588 4703 scope.go:117] "RemoveContainer" containerID="6d11f5ef8a0ac3d5a80f1e8dac311415413a9cbc9d7743a550af20fac1750371" Mar 09 13:50:52 crc kubenswrapper[4703]: E0309 13:50:52.204041 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d11f5ef8a0ac3d5a80f1e8dac311415413a9cbc9d7743a550af20fac1750371\": container with ID starting with 6d11f5ef8a0ac3d5a80f1e8dac311415413a9cbc9d7743a550af20fac1750371 not found: ID does not exist" containerID="6d11f5ef8a0ac3d5a80f1e8dac311415413a9cbc9d7743a550af20fac1750371" Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.204075 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d11f5ef8a0ac3d5a80f1e8dac311415413a9cbc9d7743a550af20fac1750371"} err="failed to get container status \"6d11f5ef8a0ac3d5a80f1e8dac311415413a9cbc9d7743a550af20fac1750371\": rpc error: code = NotFound desc = could not find container \"6d11f5ef8a0ac3d5a80f1e8dac311415413a9cbc9d7743a550af20fac1750371\": 
container with ID starting with 6d11f5ef8a0ac3d5a80f1e8dac311415413a9cbc9d7743a550af20fac1750371 not found: ID does not exist" Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.204093 4703 scope.go:117] "RemoveContainer" containerID="7c0e0313e43084afd09947048544b440155cf127dc62063cb6877cf49ae83522" Mar 09 13:50:52 crc kubenswrapper[4703]: E0309 13:50:52.204595 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c0e0313e43084afd09947048544b440155cf127dc62063cb6877cf49ae83522\": container with ID starting with 7c0e0313e43084afd09947048544b440155cf127dc62063cb6877cf49ae83522 not found: ID does not exist" containerID="7c0e0313e43084afd09947048544b440155cf127dc62063cb6877cf49ae83522" Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.204618 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c0e0313e43084afd09947048544b440155cf127dc62063cb6877cf49ae83522"} err="failed to get container status \"7c0e0313e43084afd09947048544b440155cf127dc62063cb6877cf49ae83522\": rpc error: code = NotFound desc = could not find container \"7c0e0313e43084afd09947048544b440155cf127dc62063cb6877cf49ae83522\": container with ID starting with 7c0e0313e43084afd09947048544b440155cf127dc62063cb6877cf49ae83522 not found: ID does not exist" Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.204637 4703 scope.go:117] "RemoveContainer" containerID="9f36a2002e033f680a29494b648564abd6c6aeef63f87a66988c66eaabfb05b6" Mar 09 13:50:52 crc kubenswrapper[4703]: E0309 13:50:52.205210 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f36a2002e033f680a29494b648564abd6c6aeef63f87a66988c66eaabfb05b6\": container with ID starting with 9f36a2002e033f680a29494b648564abd6c6aeef63f87a66988c66eaabfb05b6 not found: ID does not exist" 
containerID="9f36a2002e033f680a29494b648564abd6c6aeef63f87a66988c66eaabfb05b6" Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.205261 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f36a2002e033f680a29494b648564abd6c6aeef63f87a66988c66eaabfb05b6"} err="failed to get container status \"9f36a2002e033f680a29494b648564abd6c6aeef63f87a66988c66eaabfb05b6\": rpc error: code = NotFound desc = could not find container \"9f36a2002e033f680a29494b648564abd6c6aeef63f87a66988c66eaabfb05b6\": container with ID starting with 9f36a2002e033f680a29494b648564abd6c6aeef63f87a66988c66eaabfb05b6 not found: ID does not exist" Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.242008 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67fb39e9-eb20-4e4a-85b2-d7f156982112-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.242319 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67fb39e9-eb20-4e4a-85b2-d7f156982112-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.242422 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b4px\" (UniqueName: \"kubernetes.io/projected/67fb39e9-eb20-4e4a-85b2-d7f156982112-kube-api-access-4b4px\") on node \"crc\" DevicePath \"\"" Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.451322 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dq7cf"] Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.451780 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dq7cf" podUID="a8ed0f0e-0307-415e-874c-16766d38b041" containerName="registry-server" 
containerID="cri-o://e88fae627ca3058af095b4c56289aafd50d6e10760077df149239e1d326704aa" gracePeriod=2 Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.493622 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n8kh7"] Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.498786 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n8kh7"] Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.720104 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67fb39e9-eb20-4e4a-85b2-d7f156982112" path="/var/lib/kubelet/pods/67fb39e9-eb20-4e4a-85b2-d7f156982112/volumes" Mar 09 13:50:52 crc kubenswrapper[4703]: I0309 13:50:52.865171 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dq7cf" Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.054700 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbz8d\" (UniqueName: \"kubernetes.io/projected/a8ed0f0e-0307-415e-874c-16766d38b041-kube-api-access-kbz8d\") pod \"a8ed0f0e-0307-415e-874c-16766d38b041\" (UID: \"a8ed0f0e-0307-415e-874c-16766d38b041\") " Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.054776 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ed0f0e-0307-415e-874c-16766d38b041-catalog-content\") pod \"a8ed0f0e-0307-415e-874c-16766d38b041\" (UID: \"a8ed0f0e-0307-415e-874c-16766d38b041\") " Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.054870 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ed0f0e-0307-415e-874c-16766d38b041-utilities\") pod \"a8ed0f0e-0307-415e-874c-16766d38b041\" (UID: \"a8ed0f0e-0307-415e-874c-16766d38b041\") " Mar 09 13:50:53 crc 
kubenswrapper[4703]: I0309 13:50:53.056243 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8ed0f0e-0307-415e-874c-16766d38b041-utilities" (OuterVolumeSpecName: "utilities") pod "a8ed0f0e-0307-415e-874c-16766d38b041" (UID: "a8ed0f0e-0307-415e-874c-16766d38b041"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.059926 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8ed0f0e-0307-415e-874c-16766d38b041-kube-api-access-kbz8d" (OuterVolumeSpecName: "kube-api-access-kbz8d") pod "a8ed0f0e-0307-415e-874c-16766d38b041" (UID: "a8ed0f0e-0307-415e-874c-16766d38b041"). InnerVolumeSpecName "kube-api-access-kbz8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.129167 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8ed0f0e-0307-415e-874c-16766d38b041-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8ed0f0e-0307-415e-874c-16766d38b041" (UID: "a8ed0f0e-0307-415e-874c-16766d38b041"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.156383 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbz8d\" (UniqueName: \"kubernetes.io/projected/a8ed0f0e-0307-415e-874c-16766d38b041-kube-api-access-kbz8d\") on node \"crc\" DevicePath \"\"" Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.156425 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ed0f0e-0307-415e-874c-16766d38b041-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.156441 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ed0f0e-0307-415e-874c-16766d38b041-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.163466 4703 generic.go:334] "Generic (PLEG): container finished" podID="a8ed0f0e-0307-415e-874c-16766d38b041" containerID="e88fae627ca3058af095b4c56289aafd50d6e10760077df149239e1d326704aa" exitCode=0 Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.163565 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dq7cf" event={"ID":"a8ed0f0e-0307-415e-874c-16766d38b041","Type":"ContainerDied","Data":"e88fae627ca3058af095b4c56289aafd50d6e10760077df149239e1d326704aa"} Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.163573 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dq7cf" Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.163600 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dq7cf" event={"ID":"a8ed0f0e-0307-415e-874c-16766d38b041","Type":"ContainerDied","Data":"6d64503d9155a3b5b24c703d9f473d9581447a126af4171bb575072862cef5b8"} Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.163622 4703 scope.go:117] "RemoveContainer" containerID="e88fae627ca3058af095b4c56289aafd50d6e10760077df149239e1d326704aa" Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.183024 4703 scope.go:117] "RemoveContainer" containerID="725e83f2d792533f15e45b3ad9c2be2ea685af6820818b3d8334865b1acf72a7" Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.208332 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dq7cf"] Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.213987 4703 scope.go:117] "RemoveContainer" containerID="71574ead46ca428b49e3086ae8ca49654e161729cf6e7320819de15bfb241c09" Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.216166 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dq7cf"] Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.236406 4703 scope.go:117] "RemoveContainer" containerID="e88fae627ca3058af095b4c56289aafd50d6e10760077df149239e1d326704aa" Mar 09 13:50:53 crc kubenswrapper[4703]: E0309 13:50:53.236826 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e88fae627ca3058af095b4c56289aafd50d6e10760077df149239e1d326704aa\": container with ID starting with e88fae627ca3058af095b4c56289aafd50d6e10760077df149239e1d326704aa not found: ID does not exist" containerID="e88fae627ca3058af095b4c56289aafd50d6e10760077df149239e1d326704aa" Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.236957 4703 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e88fae627ca3058af095b4c56289aafd50d6e10760077df149239e1d326704aa"} err="failed to get container status \"e88fae627ca3058af095b4c56289aafd50d6e10760077df149239e1d326704aa\": rpc error: code = NotFound desc = could not find container \"e88fae627ca3058af095b4c56289aafd50d6e10760077df149239e1d326704aa\": container with ID starting with e88fae627ca3058af095b4c56289aafd50d6e10760077df149239e1d326704aa not found: ID does not exist" Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.237000 4703 scope.go:117] "RemoveContainer" containerID="725e83f2d792533f15e45b3ad9c2be2ea685af6820818b3d8334865b1acf72a7" Mar 09 13:50:53 crc kubenswrapper[4703]: E0309 13:50:53.237412 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"725e83f2d792533f15e45b3ad9c2be2ea685af6820818b3d8334865b1acf72a7\": container with ID starting with 725e83f2d792533f15e45b3ad9c2be2ea685af6820818b3d8334865b1acf72a7 not found: ID does not exist" containerID="725e83f2d792533f15e45b3ad9c2be2ea685af6820818b3d8334865b1acf72a7" Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.237494 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"725e83f2d792533f15e45b3ad9c2be2ea685af6820818b3d8334865b1acf72a7"} err="failed to get container status \"725e83f2d792533f15e45b3ad9c2be2ea685af6820818b3d8334865b1acf72a7\": rpc error: code = NotFound desc = could not find container \"725e83f2d792533f15e45b3ad9c2be2ea685af6820818b3d8334865b1acf72a7\": container with ID starting with 725e83f2d792533f15e45b3ad9c2be2ea685af6820818b3d8334865b1acf72a7 not found: ID does not exist" Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.237630 4703 scope.go:117] "RemoveContainer" containerID="71574ead46ca428b49e3086ae8ca49654e161729cf6e7320819de15bfb241c09" Mar 09 13:50:53 crc kubenswrapper[4703]: E0309 
13:50:53.238020 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71574ead46ca428b49e3086ae8ca49654e161729cf6e7320819de15bfb241c09\": container with ID starting with 71574ead46ca428b49e3086ae8ca49654e161729cf6e7320819de15bfb241c09 not found: ID does not exist" containerID="71574ead46ca428b49e3086ae8ca49654e161729cf6e7320819de15bfb241c09" Mar 09 13:50:53 crc kubenswrapper[4703]: I0309 13:50:53.238051 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71574ead46ca428b49e3086ae8ca49654e161729cf6e7320819de15bfb241c09"} err="failed to get container status \"71574ead46ca428b49e3086ae8ca49654e161729cf6e7320819de15bfb241c09\": rpc error: code = NotFound desc = could not find container \"71574ead46ca428b49e3086ae8ca49654e161729cf6e7320819de15bfb241c09\": container with ID starting with 71574ead46ca428b49e3086ae8ca49654e161729cf6e7320819de15bfb241c09 not found: ID does not exist" Mar 09 13:50:54 crc kubenswrapper[4703]: I0309 13:50:54.711300 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:50:54 crc kubenswrapper[4703]: E0309 13:50:54.712036 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:50:54 crc kubenswrapper[4703]: I0309 13:50:54.717139 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8ed0f0e-0307-415e-874c-16766d38b041" path="/var/lib/kubelet/pods/a8ed0f0e-0307-415e-874c-16766d38b041/volumes" Mar 09 13:51:05 crc kubenswrapper[4703]: I0309 13:51:05.707120 
4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:51:05 crc kubenswrapper[4703]: E0309 13:51:05.707772 4703 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-pmzvj_openshift-machine-config-operator(4316a119-ceb8-44c1-a4ad-2d64ca0c0f29)\"" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" Mar 09 13:51:17 crc kubenswrapper[4703]: I0309 13:51:17.707256 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:51:18 crc kubenswrapper[4703]: I0309 13:51:18.357827 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" event={"ID":"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29","Type":"ContainerStarted","Data":"b3342bd6c5cd07bc6710dd5b17583cd8fbc827fddd06c4bc9f11d0410027cf33"} Mar 09 13:51:28 crc kubenswrapper[4703]: I0309 13:51:28.951310 4703 scope.go:117] "RemoveContainer" containerID="991fe9f417eebd16c20aad1957932e412e0306698ee1cd36e1c1c45c7ced3d2d" Mar 09 13:51:44 crc kubenswrapper[4703]: I0309 13:51:44.271767 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ckb7v/must-gather-ctzt7"] Mar 09 13:51:44 crc kubenswrapper[4703]: E0309 13:51:44.272708 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ed0f0e-0307-415e-874c-16766d38b041" containerName="extract-utilities" Mar 09 13:51:44 crc kubenswrapper[4703]: I0309 13:51:44.272733 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ed0f0e-0307-415e-874c-16766d38b041" containerName="extract-utilities" Mar 09 13:51:44 crc kubenswrapper[4703]: E0309 13:51:44.272761 4703 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="a8ed0f0e-0307-415e-874c-16766d38b041" containerName="registry-server" Mar 09 13:51:44 crc kubenswrapper[4703]: I0309 13:51:44.272774 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ed0f0e-0307-415e-874c-16766d38b041" containerName="registry-server" Mar 09 13:51:44 crc kubenswrapper[4703]: E0309 13:51:44.272791 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67fb39e9-eb20-4e4a-85b2-d7f156982112" containerName="extract-utilities" Mar 09 13:51:44 crc kubenswrapper[4703]: I0309 13:51:44.272804 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="67fb39e9-eb20-4e4a-85b2-d7f156982112" containerName="extract-utilities" Mar 09 13:51:44 crc kubenswrapper[4703]: E0309 13:51:44.272865 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ed0f0e-0307-415e-874c-16766d38b041" containerName="extract-content" Mar 09 13:51:44 crc kubenswrapper[4703]: I0309 13:51:44.272880 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ed0f0e-0307-415e-874c-16766d38b041" containerName="extract-content" Mar 09 13:51:44 crc kubenswrapper[4703]: E0309 13:51:44.272900 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67fb39e9-eb20-4e4a-85b2-d7f156982112" containerName="registry-server" Mar 09 13:51:44 crc kubenswrapper[4703]: I0309 13:51:44.272911 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="67fb39e9-eb20-4e4a-85b2-d7f156982112" containerName="registry-server" Mar 09 13:51:44 crc kubenswrapper[4703]: E0309 13:51:44.272931 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67fb39e9-eb20-4e4a-85b2-d7f156982112" containerName="extract-content" Mar 09 13:51:44 crc kubenswrapper[4703]: I0309 13:51:44.272939 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="67fb39e9-eb20-4e4a-85b2-d7f156982112" containerName="extract-content" Mar 09 13:51:44 crc kubenswrapper[4703]: I0309 13:51:44.273055 4703 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a8ed0f0e-0307-415e-874c-16766d38b041" containerName="registry-server" Mar 09 13:51:44 crc kubenswrapper[4703]: I0309 13:51:44.273071 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="67fb39e9-eb20-4e4a-85b2-d7f156982112" containerName="registry-server" Mar 09 13:51:44 crc kubenswrapper[4703]: I0309 13:51:44.273740 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ckb7v/must-gather-ctzt7" Mar 09 13:51:44 crc kubenswrapper[4703]: I0309 13:51:44.279606 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ckb7v"/"openshift-service-ca.crt" Mar 09 13:51:44 crc kubenswrapper[4703]: I0309 13:51:44.280251 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ckb7v"/"kube-root-ca.crt" Mar 09 13:51:44 crc kubenswrapper[4703]: I0309 13:51:44.287418 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ckb7v/must-gather-ctzt7"] Mar 09 13:51:44 crc kubenswrapper[4703]: I0309 13:51:44.416237 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhmjh\" (UniqueName: \"kubernetes.io/projected/3a63e24b-3a77-4f72-9a0d-0c2c294a92e0-kube-api-access-zhmjh\") pod \"must-gather-ctzt7\" (UID: \"3a63e24b-3a77-4f72-9a0d-0c2c294a92e0\") " pod="openshift-must-gather-ckb7v/must-gather-ctzt7" Mar 09 13:51:44 crc kubenswrapper[4703]: I0309 13:51:44.416294 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3a63e24b-3a77-4f72-9a0d-0c2c294a92e0-must-gather-output\") pod \"must-gather-ctzt7\" (UID: \"3a63e24b-3a77-4f72-9a0d-0c2c294a92e0\") " pod="openshift-must-gather-ckb7v/must-gather-ctzt7" Mar 09 13:51:44 crc kubenswrapper[4703]: I0309 13:51:44.517781 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zhmjh\" (UniqueName: \"kubernetes.io/projected/3a63e24b-3a77-4f72-9a0d-0c2c294a92e0-kube-api-access-zhmjh\") pod \"must-gather-ctzt7\" (UID: \"3a63e24b-3a77-4f72-9a0d-0c2c294a92e0\") " pod="openshift-must-gather-ckb7v/must-gather-ctzt7" Mar 09 13:51:44 crc kubenswrapper[4703]: I0309 13:51:44.517880 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3a63e24b-3a77-4f72-9a0d-0c2c294a92e0-must-gather-output\") pod \"must-gather-ctzt7\" (UID: \"3a63e24b-3a77-4f72-9a0d-0c2c294a92e0\") " pod="openshift-must-gather-ckb7v/must-gather-ctzt7" Mar 09 13:51:44 crc kubenswrapper[4703]: I0309 13:51:44.518430 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3a63e24b-3a77-4f72-9a0d-0c2c294a92e0-must-gather-output\") pod \"must-gather-ctzt7\" (UID: \"3a63e24b-3a77-4f72-9a0d-0c2c294a92e0\") " pod="openshift-must-gather-ckb7v/must-gather-ctzt7" Mar 09 13:51:44 crc kubenswrapper[4703]: I0309 13:51:44.537985 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhmjh\" (UniqueName: \"kubernetes.io/projected/3a63e24b-3a77-4f72-9a0d-0c2c294a92e0-kube-api-access-zhmjh\") pod \"must-gather-ctzt7\" (UID: \"3a63e24b-3a77-4f72-9a0d-0c2c294a92e0\") " pod="openshift-must-gather-ckb7v/must-gather-ctzt7" Mar 09 13:51:44 crc kubenswrapper[4703]: I0309 13:51:44.594500 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ckb7v/must-gather-ctzt7" Mar 09 13:51:44 crc kubenswrapper[4703]: I0309 13:51:44.867655 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ckb7v/must-gather-ctzt7"] Mar 09 13:51:45 crc kubenswrapper[4703]: I0309 13:51:45.580741 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ckb7v/must-gather-ctzt7" event={"ID":"3a63e24b-3a77-4f72-9a0d-0c2c294a92e0","Type":"ContainerStarted","Data":"2fc65c6fba885caf6e5f3b70bea9168e80ade4978036246cfe5bc69f81bb5d39"} Mar 09 13:51:45 crc kubenswrapper[4703]: I0309 13:51:45.581215 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ckb7v/must-gather-ctzt7" event={"ID":"3a63e24b-3a77-4f72-9a0d-0c2c294a92e0","Type":"ContainerStarted","Data":"ae2fd0c498b5a67061d9e9d8d092e0edf3e66b03c36f97154fb485f52649120f"} Mar 09 13:51:45 crc kubenswrapper[4703]: I0309 13:51:45.581237 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ckb7v/must-gather-ctzt7" event={"ID":"3a63e24b-3a77-4f72-9a0d-0c2c294a92e0","Type":"ContainerStarted","Data":"f94972c85468cc51f618aad153c36da58961bc9c15d2227e78f0d79cb026ae1f"} Mar 09 13:51:45 crc kubenswrapper[4703]: I0309 13:51:45.597131 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ckb7v/must-gather-ctzt7" podStartSLOduration=1.597106211 podStartE2EDuration="1.597106211s" podCreationTimestamp="2026-03-09 13:51:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 13:51:45.595831374 +0000 UTC m=+1901.563247060" watchObservedRunningTime="2026-03-09 13:51:45.597106211 +0000 UTC m=+1901.564521907" Mar 09 13:52:00 crc kubenswrapper[4703]: I0309 13:52:00.153558 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551072-ht47m"] Mar 09 13:52:00 crc 
kubenswrapper[4703]: I0309 13:52:00.154895 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551072-ht47m" Mar 09 13:52:00 crc kubenswrapper[4703]: I0309 13:52:00.157381 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:52:00 crc kubenswrapper[4703]: I0309 13:52:00.157409 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:52:00 crc kubenswrapper[4703]: I0309 13:52:00.157735 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rkrjn" Mar 09 13:52:00 crc kubenswrapper[4703]: I0309 13:52:00.162415 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551072-ht47m"] Mar 09 13:52:00 crc kubenswrapper[4703]: I0309 13:52:00.325000 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqb57\" (UniqueName: \"kubernetes.io/projected/69dd82bb-929d-48ea-b91e-61955282e200-kube-api-access-mqb57\") pod \"auto-csr-approver-29551072-ht47m\" (UID: \"69dd82bb-929d-48ea-b91e-61955282e200\") " pod="openshift-infra/auto-csr-approver-29551072-ht47m" Mar 09 13:52:00 crc kubenswrapper[4703]: I0309 13:52:00.426577 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqb57\" (UniqueName: \"kubernetes.io/projected/69dd82bb-929d-48ea-b91e-61955282e200-kube-api-access-mqb57\") pod \"auto-csr-approver-29551072-ht47m\" (UID: \"69dd82bb-929d-48ea-b91e-61955282e200\") " pod="openshift-infra/auto-csr-approver-29551072-ht47m" Mar 09 13:52:00 crc kubenswrapper[4703]: I0309 13:52:00.464887 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqb57\" (UniqueName: \"kubernetes.io/projected/69dd82bb-929d-48ea-b91e-61955282e200-kube-api-access-mqb57\") pod 
\"auto-csr-approver-29551072-ht47m\" (UID: \"69dd82bb-929d-48ea-b91e-61955282e200\") " pod="openshift-infra/auto-csr-approver-29551072-ht47m" Mar 09 13:52:00 crc kubenswrapper[4703]: I0309 13:52:00.475695 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551072-ht47m" Mar 09 13:52:00 crc kubenswrapper[4703]: I0309 13:52:00.741895 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551072-ht47m"] Mar 09 13:52:01 crc kubenswrapper[4703]: I0309 13:52:01.683497 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551072-ht47m" event={"ID":"69dd82bb-929d-48ea-b91e-61955282e200","Type":"ContainerStarted","Data":"56f4717bebd75cc307d75f4b70bdb806be9c6659f10f139ec4579c7cf87ec6f9"} Mar 09 13:52:02 crc kubenswrapper[4703]: I0309 13:52:02.693337 4703 generic.go:334] "Generic (PLEG): container finished" podID="69dd82bb-929d-48ea-b91e-61955282e200" containerID="362038c83af74b6059a1fe71c2157b20923e5ff585074abbac17aa8bdc3c3439" exitCode=0 Mar 09 13:52:02 crc kubenswrapper[4703]: I0309 13:52:02.693423 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551072-ht47m" event={"ID":"69dd82bb-929d-48ea-b91e-61955282e200","Type":"ContainerDied","Data":"362038c83af74b6059a1fe71c2157b20923e5ff585074abbac17aa8bdc3c3439"} Mar 09 13:52:03 crc kubenswrapper[4703]: I0309 13:52:03.984737 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551072-ht47m" Mar 09 13:52:04 crc kubenswrapper[4703]: I0309 13:52:04.176567 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqb57\" (UniqueName: \"kubernetes.io/projected/69dd82bb-929d-48ea-b91e-61955282e200-kube-api-access-mqb57\") pod \"69dd82bb-929d-48ea-b91e-61955282e200\" (UID: \"69dd82bb-929d-48ea-b91e-61955282e200\") " Mar 09 13:52:04 crc kubenswrapper[4703]: I0309 13:52:04.182754 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69dd82bb-929d-48ea-b91e-61955282e200-kube-api-access-mqb57" (OuterVolumeSpecName: "kube-api-access-mqb57") pod "69dd82bb-929d-48ea-b91e-61955282e200" (UID: "69dd82bb-929d-48ea-b91e-61955282e200"). InnerVolumeSpecName "kube-api-access-mqb57". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:52:04 crc kubenswrapper[4703]: I0309 13:52:04.278029 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqb57\" (UniqueName: \"kubernetes.io/projected/69dd82bb-929d-48ea-b91e-61955282e200-kube-api-access-mqb57\") on node \"crc\" DevicePath \"\"" Mar 09 13:52:04 crc kubenswrapper[4703]: I0309 13:52:04.714218 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551072-ht47m" Mar 09 13:52:04 crc kubenswrapper[4703]: I0309 13:52:04.724877 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551072-ht47m" event={"ID":"69dd82bb-929d-48ea-b91e-61955282e200","Type":"ContainerDied","Data":"56f4717bebd75cc307d75f4b70bdb806be9c6659f10f139ec4579c7cf87ec6f9"} Mar 09 13:52:04 crc kubenswrapper[4703]: I0309 13:52:04.724938 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56f4717bebd75cc307d75f4b70bdb806be9c6659f10f139ec4579c7cf87ec6f9" Mar 09 13:52:05 crc kubenswrapper[4703]: I0309 13:52:05.047119 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551066-799x4"] Mar 09 13:52:05 crc kubenswrapper[4703]: I0309 13:52:05.051217 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551066-799x4"] Mar 09 13:52:06 crc kubenswrapper[4703]: I0309 13:52:06.721147 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1" path="/var/lib/kubelet/pods/8e07cbce-01b1-4df2-a3cb-0a7d73ee51d1/volumes" Mar 09 13:52:29 crc kubenswrapper[4703]: I0309 13:52:29.044128 4703 scope.go:117] "RemoveContainer" containerID="3cd07f5da59dfaabbaa13949c32ac067d296ea3a6a0934fd5fcc5f70670c8ede" Mar 09 13:52:34 crc kubenswrapper[4703]: I0309 13:52:34.956207 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-kzdxt_9b14d2ef-2e68-4bb5-be73-2bbbb837c463/control-plane-machine-set-operator/0.log" Mar 09 13:52:35 crc kubenswrapper[4703]: I0309 13:52:35.111112 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6qv87_7e8298d3-3e49-4df3-9369-2623b11981cf/kube-rbac-proxy/0.log" Mar 09 13:52:35 crc kubenswrapper[4703]: I0309 13:52:35.143314 4703 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6qv87_7e8298d3-3e49-4df3-9369-2623b11981cf/machine-api-operator/0.log" Mar 09 13:53:03 crc kubenswrapper[4703]: I0309 13:53:03.774358 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-htlzv_1b3ec15b-631d-4ea5-b1f2-899a5d44785d/kube-rbac-proxy/0.log" Mar 09 13:53:03 crc kubenswrapper[4703]: I0309 13:53:03.779528 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-htlzv_1b3ec15b-631d-4ea5-b1f2-899a5d44785d/controller/0.log" Mar 09 13:53:03 crc kubenswrapper[4703]: I0309 13:53:03.943885 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-mqpjj_7941f29f-2aed-45e8-b9c3-d4e0c19573ee/frr-k8s-webhook-server/0.log" Mar 09 13:53:04 crc kubenswrapper[4703]: I0309 13:53:04.101339 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-frr-files/0.log" Mar 09 13:53:04 crc kubenswrapper[4703]: I0309 13:53:04.189922 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-reloader/0.log" Mar 09 13:53:04 crc kubenswrapper[4703]: I0309 13:53:04.206463 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-metrics/0.log" Mar 09 13:53:04 crc kubenswrapper[4703]: I0309 13:53:04.236305 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-frr-files/0.log" Mar 09 13:53:04 crc kubenswrapper[4703]: I0309 13:53:04.281237 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-reloader/0.log" Mar 09 13:53:04 crc kubenswrapper[4703]: I0309 
13:53:04.444052 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-frr-files/0.log" Mar 09 13:53:04 crc kubenswrapper[4703]: I0309 13:53:04.459345 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-reloader/0.log" Mar 09 13:53:04 crc kubenswrapper[4703]: I0309 13:53:04.481521 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-metrics/0.log" Mar 09 13:53:04 crc kubenswrapper[4703]: I0309 13:53:04.486381 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-metrics/0.log" Mar 09 13:53:04 crc kubenswrapper[4703]: I0309 13:53:04.668252 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-frr-files/0.log" Mar 09 13:53:04 crc kubenswrapper[4703]: I0309 13:53:04.675962 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-metrics/0.log" Mar 09 13:53:04 crc kubenswrapper[4703]: I0309 13:53:04.686280 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/controller/0.log" Mar 09 13:53:04 crc kubenswrapper[4703]: I0309 13:53:04.704553 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/cp-reloader/0.log" Mar 09 13:53:04 crc kubenswrapper[4703]: I0309 13:53:04.843464 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/frr-metrics/0.log" Mar 09 13:53:04 crc kubenswrapper[4703]: I0309 13:53:04.860775 4703 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/kube-rbac-proxy/0.log" Mar 09 13:53:04 crc kubenswrapper[4703]: I0309 13:53:04.911923 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/kube-rbac-proxy-frr/0.log" Mar 09 13:53:05 crc kubenswrapper[4703]: I0309 13:53:05.061219 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/reloader/0.log" Mar 09 13:53:05 crc kubenswrapper[4703]: I0309 13:53:05.106863 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6c588f5d89-l7mpr_cad59883-2357-4002-a757-689c894f9c33/manager/0.log" Mar 09 13:53:05 crc kubenswrapper[4703]: I0309 13:53:05.296680 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5bcdd89498-pjlcz_f2cf06d7-6a45-4d68-9a2e-ef6bcf831e47/webhook-server/0.log" Mar 09 13:53:05 crc kubenswrapper[4703]: I0309 13:53:05.383007 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xrrnq_b39d0cb4-408c-4af7-a2e1-e3611a3eb09e/frr/0.log" Mar 09 13:53:05 crc kubenswrapper[4703]: I0309 13:53:05.400039 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qlmqf_a7841ae2-e705-49ff-a7f9-83fc45d05454/kube-rbac-proxy/0.log" Mar 09 13:53:05 crc kubenswrapper[4703]: I0309 13:53:05.581375 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qlmqf_a7841ae2-e705-49ff-a7f9-83fc45d05454/speaker/0.log" Mar 09 13:53:31 crc kubenswrapper[4703]: I0309 13:53:31.763192 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-78djc_a6cc879a-e440-4851-b7ef-d3894d17bd60/extract-utilities/0.log" Mar 09 13:53:32 crc kubenswrapper[4703]: I0309 13:53:32.038679 4703 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_certified-operators-78djc_a6cc879a-e440-4851-b7ef-d3894d17bd60/extract-content/0.log" Mar 09 13:53:32 crc kubenswrapper[4703]: I0309 13:53:32.044903 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-78djc_a6cc879a-e440-4851-b7ef-d3894d17bd60/extract-utilities/0.log" Mar 09 13:53:32 crc kubenswrapper[4703]: I0309 13:53:32.066747 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-78djc_a6cc879a-e440-4851-b7ef-d3894d17bd60/extract-content/0.log" Mar 09 13:53:32 crc kubenswrapper[4703]: I0309 13:53:32.252029 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-78djc_a6cc879a-e440-4851-b7ef-d3894d17bd60/extract-utilities/0.log" Mar 09 13:53:32 crc kubenswrapper[4703]: I0309 13:53:32.298674 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-78djc_a6cc879a-e440-4851-b7ef-d3894d17bd60/extract-content/0.log" Mar 09 13:53:32 crc kubenswrapper[4703]: I0309 13:53:32.501125 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8rqdr_fbd51f58-4fd4-4d47-a229-9e94e5328d93/extract-utilities/0.log" Mar 09 13:53:32 crc kubenswrapper[4703]: I0309 13:53:32.581343 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-78djc_a6cc879a-e440-4851-b7ef-d3894d17bd60/registry-server/0.log" Mar 09 13:53:32 crc kubenswrapper[4703]: I0309 13:53:32.607962 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8rqdr_fbd51f58-4fd4-4d47-a229-9e94e5328d93/extract-utilities/0.log" Mar 09 13:53:32 crc kubenswrapper[4703]: I0309 13:53:32.663455 4703 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-8rqdr_fbd51f58-4fd4-4d47-a229-9e94e5328d93/extract-content/0.log" Mar 09 13:53:32 crc kubenswrapper[4703]: I0309 13:53:32.696020 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8rqdr_fbd51f58-4fd4-4d47-a229-9e94e5328d93/extract-content/0.log" Mar 09 13:53:32 crc kubenswrapper[4703]: I0309 13:53:32.862036 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8rqdr_fbd51f58-4fd4-4d47-a229-9e94e5328d93/extract-utilities/0.log" Mar 09 13:53:32 crc kubenswrapper[4703]: I0309 13:53:32.887811 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8rqdr_fbd51f58-4fd4-4d47-a229-9e94e5328d93/extract-content/0.log" Mar 09 13:53:33 crc kubenswrapper[4703]: I0309 13:53:33.134191 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m_2f48ca74-d2f2-4baf-a448-e980848ac419/util/0.log" Mar 09 13:53:33 crc kubenswrapper[4703]: I0309 13:53:33.279795 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8rqdr_fbd51f58-4fd4-4d47-a229-9e94e5328d93/registry-server/0.log" Mar 09 13:53:33 crc kubenswrapper[4703]: I0309 13:53:33.376568 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m_2f48ca74-d2f2-4baf-a448-e980848ac419/util/0.log" Mar 09 13:53:33 crc kubenswrapper[4703]: I0309 13:53:33.378539 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m_2f48ca74-d2f2-4baf-a448-e980848ac419/pull/0.log" Mar 09 13:53:33 crc kubenswrapper[4703]: I0309 13:53:33.379113 4703 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m_2f48ca74-d2f2-4baf-a448-e980848ac419/pull/0.log" Mar 09 13:53:33 crc kubenswrapper[4703]: I0309 13:53:33.501566 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m_2f48ca74-d2f2-4baf-a448-e980848ac419/util/0.log" Mar 09 13:53:33 crc kubenswrapper[4703]: I0309 13:53:33.535219 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m_2f48ca74-d2f2-4baf-a448-e980848ac419/extract/0.log" Mar 09 13:53:33 crc kubenswrapper[4703]: I0309 13:53:33.560777 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4jrp8m_2f48ca74-d2f2-4baf-a448-e980848ac419/pull/0.log" Mar 09 13:53:33 crc kubenswrapper[4703]: I0309 13:53:33.696296 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-q7bwg_73560a56-06ae-4ac9-91b9-4fe8478082e7/marketplace-operator/0.log" Mar 09 13:53:33 crc kubenswrapper[4703]: I0309 13:53:33.744246 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fg2b7_8137ddd4-7854-4bd6-b93c-957922a36200/extract-utilities/0.log" Mar 09 13:53:33 crc kubenswrapper[4703]: I0309 13:53:33.862184 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fg2b7_8137ddd4-7854-4bd6-b93c-957922a36200/extract-utilities/0.log" Mar 09 13:53:33 crc kubenswrapper[4703]: I0309 13:53:33.895818 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fg2b7_8137ddd4-7854-4bd6-b93c-957922a36200/extract-content/0.log" Mar 09 13:53:33 crc kubenswrapper[4703]: I0309 13:53:33.912549 4703 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fg2b7_8137ddd4-7854-4bd6-b93c-957922a36200/extract-content/0.log" Mar 09 13:53:34 crc kubenswrapper[4703]: I0309 13:53:34.091799 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fg2b7_8137ddd4-7854-4bd6-b93c-957922a36200/extract-utilities/0.log" Mar 09 13:53:34 crc kubenswrapper[4703]: I0309 13:53:34.092554 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fg2b7_8137ddd4-7854-4bd6-b93c-957922a36200/extract-content/0.log" Mar 09 13:53:34 crc kubenswrapper[4703]: I0309 13:53:34.180146 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fg2b7_8137ddd4-7854-4bd6-b93c-957922a36200/registry-server/0.log" Mar 09 13:53:34 crc kubenswrapper[4703]: I0309 13:53:34.253802 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-685h8_9923728b-2d17-45a1-80a3-279456217b0f/extract-utilities/0.log" Mar 09 13:53:34 crc kubenswrapper[4703]: I0309 13:53:34.411550 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-685h8_9923728b-2d17-45a1-80a3-279456217b0f/extract-utilities/0.log" Mar 09 13:53:34 crc kubenswrapper[4703]: I0309 13:53:34.411767 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-685h8_9923728b-2d17-45a1-80a3-279456217b0f/extract-content/0.log" Mar 09 13:53:34 crc kubenswrapper[4703]: I0309 13:53:34.418480 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-685h8_9923728b-2d17-45a1-80a3-279456217b0f/extract-content/0.log" Mar 09 13:53:34 crc kubenswrapper[4703]: I0309 13:53:34.717688 4703 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-685h8_9923728b-2d17-45a1-80a3-279456217b0f/extract-utilities/0.log" Mar 09 13:53:34 crc kubenswrapper[4703]: I0309 13:53:34.750469 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-685h8_9923728b-2d17-45a1-80a3-279456217b0f/extract-content/0.log" Mar 09 13:53:34 crc kubenswrapper[4703]: I0309 13:53:34.869581 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-685h8_9923728b-2d17-45a1-80a3-279456217b0f/registry-server/0.log" Mar 09 13:53:39 crc kubenswrapper[4703]: I0309 13:53:39.500426 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:53:39 crc kubenswrapper[4703]: I0309 13:53:39.500838 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:54:00 crc kubenswrapper[4703]: I0309 13:54:00.160861 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551074-wfrhp"] Mar 09 13:54:00 crc kubenswrapper[4703]: E0309 13:54:00.161637 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69dd82bb-929d-48ea-b91e-61955282e200" containerName="oc" Mar 09 13:54:00 crc kubenswrapper[4703]: I0309 13:54:00.161651 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="69dd82bb-929d-48ea-b91e-61955282e200" containerName="oc" Mar 09 13:54:00 crc kubenswrapper[4703]: I0309 13:54:00.161780 4703 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="69dd82bb-929d-48ea-b91e-61955282e200" containerName="oc" Mar 09 13:54:00 crc kubenswrapper[4703]: I0309 13:54:00.162237 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551074-wfrhp" Mar 09 13:54:00 crc kubenswrapper[4703]: I0309 13:54:00.166560 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:54:00 crc kubenswrapper[4703]: I0309 13:54:00.166789 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:54:00 crc kubenswrapper[4703]: I0309 13:54:00.168306 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rkrjn" Mar 09 13:54:00 crc kubenswrapper[4703]: I0309 13:54:00.181750 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551074-wfrhp"] Mar 09 13:54:00 crc kubenswrapper[4703]: I0309 13:54:00.200549 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-464dh\" (UniqueName: \"kubernetes.io/projected/b997e9e7-f5a8-4c95-a3a1-980dc816858e-kube-api-access-464dh\") pod \"auto-csr-approver-29551074-wfrhp\" (UID: \"b997e9e7-f5a8-4c95-a3a1-980dc816858e\") " pod="openshift-infra/auto-csr-approver-29551074-wfrhp" Mar 09 13:54:00 crc kubenswrapper[4703]: I0309 13:54:00.303255 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-464dh\" (UniqueName: \"kubernetes.io/projected/b997e9e7-f5a8-4c95-a3a1-980dc816858e-kube-api-access-464dh\") pod \"auto-csr-approver-29551074-wfrhp\" (UID: \"b997e9e7-f5a8-4c95-a3a1-980dc816858e\") " pod="openshift-infra/auto-csr-approver-29551074-wfrhp" Mar 09 13:54:00 crc kubenswrapper[4703]: I0309 13:54:00.348632 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-464dh\" (UniqueName: 
\"kubernetes.io/projected/b997e9e7-f5a8-4c95-a3a1-980dc816858e-kube-api-access-464dh\") pod \"auto-csr-approver-29551074-wfrhp\" (UID: \"b997e9e7-f5a8-4c95-a3a1-980dc816858e\") " pod="openshift-infra/auto-csr-approver-29551074-wfrhp" Mar 09 13:54:00 crc kubenswrapper[4703]: I0309 13:54:00.489786 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551074-wfrhp" Mar 09 13:54:00 crc kubenswrapper[4703]: I0309 13:54:00.723430 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551074-wfrhp"] Mar 09 13:54:01 crc kubenswrapper[4703]: I0309 13:54:01.460813 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551074-wfrhp" event={"ID":"b997e9e7-f5a8-4c95-a3a1-980dc816858e","Type":"ContainerStarted","Data":"a024d9f4a9d98b7effda8e132b6dfb6f8e574489eeb95c89ae1758c5f9efc112"} Mar 09 13:54:02 crc kubenswrapper[4703]: I0309 13:54:02.466919 4703 generic.go:334] "Generic (PLEG): container finished" podID="b997e9e7-f5a8-4c95-a3a1-980dc816858e" containerID="eef81a112b0120147dc1fbbe80a6f8462f1ec9503058143db022fd992a9351b5" exitCode=0 Mar 09 13:54:02 crc kubenswrapper[4703]: I0309 13:54:02.467087 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551074-wfrhp" event={"ID":"b997e9e7-f5a8-4c95-a3a1-980dc816858e","Type":"ContainerDied","Data":"eef81a112b0120147dc1fbbe80a6f8462f1ec9503058143db022fd992a9351b5"} Mar 09 13:54:03 crc kubenswrapper[4703]: I0309 13:54:03.791540 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551074-wfrhp" Mar 09 13:54:03 crc kubenswrapper[4703]: I0309 13:54:03.852542 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-464dh\" (UniqueName: \"kubernetes.io/projected/b997e9e7-f5a8-4c95-a3a1-980dc816858e-kube-api-access-464dh\") pod \"b997e9e7-f5a8-4c95-a3a1-980dc816858e\" (UID: \"b997e9e7-f5a8-4c95-a3a1-980dc816858e\") " Mar 09 13:54:03 crc kubenswrapper[4703]: I0309 13:54:03.863176 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b997e9e7-f5a8-4c95-a3a1-980dc816858e-kube-api-access-464dh" (OuterVolumeSpecName: "kube-api-access-464dh") pod "b997e9e7-f5a8-4c95-a3a1-980dc816858e" (UID: "b997e9e7-f5a8-4c95-a3a1-980dc816858e"). InnerVolumeSpecName "kube-api-access-464dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:54:03 crc kubenswrapper[4703]: I0309 13:54:03.954037 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-464dh\" (UniqueName: \"kubernetes.io/projected/b997e9e7-f5a8-4c95-a3a1-980dc816858e-kube-api-access-464dh\") on node \"crc\" DevicePath \"\"" Mar 09 13:54:04 crc kubenswrapper[4703]: I0309 13:54:04.482778 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551074-wfrhp" event={"ID":"b997e9e7-f5a8-4c95-a3a1-980dc816858e","Type":"ContainerDied","Data":"a024d9f4a9d98b7effda8e132b6dfb6f8e574489eeb95c89ae1758c5f9efc112"} Mar 09 13:54:04 crc kubenswrapper[4703]: I0309 13:54:04.482842 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a024d9f4a9d98b7effda8e132b6dfb6f8e574489eeb95c89ae1758c5f9efc112" Mar 09 13:54:04 crc kubenswrapper[4703]: I0309 13:54:04.482997 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551074-wfrhp" Mar 09 13:54:04 crc kubenswrapper[4703]: I0309 13:54:04.879420 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551068-krbtc"] Mar 09 13:54:04 crc kubenswrapper[4703]: I0309 13:54:04.884533 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551068-krbtc"] Mar 09 13:54:06 crc kubenswrapper[4703]: I0309 13:54:06.721312 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51432a9e-2585-4e57-a434-3ceef691ce43" path="/var/lib/kubelet/pods/51432a9e-2585-4e57-a434-3ceef691ce43/volumes" Mar 09 13:54:09 crc kubenswrapper[4703]: I0309 13:54:09.500575 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:54:09 crc kubenswrapper[4703]: I0309 13:54:09.500669 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:54:29 crc kubenswrapper[4703]: I0309 13:54:29.137025 4703 scope.go:117] "RemoveContainer" containerID="a981b4dc895454936c60a2c0cd57431ed19eb530423f942402ee12ba66efd649" Mar 09 13:54:39 crc kubenswrapper[4703]: I0309 13:54:39.499693 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:54:39 crc kubenswrapper[4703]: 
I0309 13:54:39.500480 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:54:39 crc kubenswrapper[4703]: I0309 13:54:39.500560 4703 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" Mar 09 13:54:39 crc kubenswrapper[4703]: I0309 13:54:39.501492 4703 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b3342bd6c5cd07bc6710dd5b17583cd8fbc827fddd06c4bc9f11d0410027cf33"} pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 13:54:39 crc kubenswrapper[4703]: I0309 13:54:39.501607 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" containerID="cri-o://b3342bd6c5cd07bc6710dd5b17583cd8fbc827fddd06c4bc9f11d0410027cf33" gracePeriod=600 Mar 09 13:54:39 crc kubenswrapper[4703]: I0309 13:54:39.783195 4703 generic.go:334] "Generic (PLEG): container finished" podID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerID="b3342bd6c5cd07bc6710dd5b17583cd8fbc827fddd06c4bc9f11d0410027cf33" exitCode=0 Mar 09 13:54:39 crc kubenswrapper[4703]: I0309 13:54:39.783260 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" event={"ID":"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29","Type":"ContainerDied","Data":"b3342bd6c5cd07bc6710dd5b17583cd8fbc827fddd06c4bc9f11d0410027cf33"} Mar 09 13:54:39 crc 
kubenswrapper[4703]: I0309 13:54:39.783655 4703 scope.go:117] "RemoveContainer" containerID="4b23914fa14a72d49869a99c9b1715163cc0cbcc2135b43a6229d795dd35bf9f" Mar 09 13:54:40 crc kubenswrapper[4703]: I0309 13:54:40.805970 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" event={"ID":"4316a119-ceb8-44c1-a4ad-2d64ca0c0f29","Type":"ContainerStarted","Data":"05748bba0ef49be419afda2738c5710aaee3b62bfa5894bdf7bcca069549aa41"} Mar 09 13:54:45 crc kubenswrapper[4703]: I0309 13:54:45.862157 4703 generic.go:334] "Generic (PLEG): container finished" podID="3a63e24b-3a77-4f72-9a0d-0c2c294a92e0" containerID="ae2fd0c498b5a67061d9e9d8d092e0edf3e66b03c36f97154fb485f52649120f" exitCode=0 Mar 09 13:54:45 crc kubenswrapper[4703]: I0309 13:54:45.862228 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ckb7v/must-gather-ctzt7" event={"ID":"3a63e24b-3a77-4f72-9a0d-0c2c294a92e0","Type":"ContainerDied","Data":"ae2fd0c498b5a67061d9e9d8d092e0edf3e66b03c36f97154fb485f52649120f"} Mar 09 13:54:45 crc kubenswrapper[4703]: I0309 13:54:45.864249 4703 scope.go:117] "RemoveContainer" containerID="ae2fd0c498b5a67061d9e9d8d092e0edf3e66b03c36f97154fb485f52649120f" Mar 09 13:54:46 crc kubenswrapper[4703]: I0309 13:54:46.193362 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ckb7v_must-gather-ctzt7_3a63e24b-3a77-4f72-9a0d-0c2c294a92e0/gather/0.log" Mar 09 13:54:56 crc kubenswrapper[4703]: I0309 13:54:56.294921 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ckb7v/must-gather-ctzt7"] Mar 09 13:54:56 crc kubenswrapper[4703]: I0309 13:54:56.297208 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-ckb7v/must-gather-ctzt7" podUID="3a63e24b-3a77-4f72-9a0d-0c2c294a92e0" containerName="copy" containerID="cri-o://2fc65c6fba885caf6e5f3b70bea9168e80ade4978036246cfe5bc69f81bb5d39" 
gracePeriod=2 Mar 09 13:54:56 crc kubenswrapper[4703]: I0309 13:54:56.310766 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ckb7v/must-gather-ctzt7"] Mar 09 13:54:56 crc kubenswrapper[4703]: I0309 13:54:56.679560 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ckb7v_must-gather-ctzt7_3a63e24b-3a77-4f72-9a0d-0c2c294a92e0/copy/0.log" Mar 09 13:54:56 crc kubenswrapper[4703]: I0309 13:54:56.680045 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ckb7v/must-gather-ctzt7" Mar 09 13:54:56 crc kubenswrapper[4703]: I0309 13:54:56.847495 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3a63e24b-3a77-4f72-9a0d-0c2c294a92e0-must-gather-output\") pod \"3a63e24b-3a77-4f72-9a0d-0c2c294a92e0\" (UID: \"3a63e24b-3a77-4f72-9a0d-0c2c294a92e0\") " Mar 09 13:54:56 crc kubenswrapper[4703]: I0309 13:54:56.847582 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhmjh\" (UniqueName: \"kubernetes.io/projected/3a63e24b-3a77-4f72-9a0d-0c2c294a92e0-kube-api-access-zhmjh\") pod \"3a63e24b-3a77-4f72-9a0d-0c2c294a92e0\" (UID: \"3a63e24b-3a77-4f72-9a0d-0c2c294a92e0\") " Mar 09 13:54:56 crc kubenswrapper[4703]: I0309 13:54:56.854137 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a63e24b-3a77-4f72-9a0d-0c2c294a92e0-kube-api-access-zhmjh" (OuterVolumeSpecName: "kube-api-access-zhmjh") pod "3a63e24b-3a77-4f72-9a0d-0c2c294a92e0" (UID: "3a63e24b-3a77-4f72-9a0d-0c2c294a92e0"). InnerVolumeSpecName "kube-api-access-zhmjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:54:56 crc kubenswrapper[4703]: I0309 13:54:56.904084 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a63e24b-3a77-4f72-9a0d-0c2c294a92e0-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "3a63e24b-3a77-4f72-9a0d-0c2c294a92e0" (UID: "3a63e24b-3a77-4f72-9a0d-0c2c294a92e0"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:54:56 crc kubenswrapper[4703]: I0309 13:54:56.938682 4703 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ckb7v_must-gather-ctzt7_3a63e24b-3a77-4f72-9a0d-0c2c294a92e0/copy/0.log" Mar 09 13:54:56 crc kubenswrapper[4703]: I0309 13:54:56.939272 4703 generic.go:334] "Generic (PLEG): container finished" podID="3a63e24b-3a77-4f72-9a0d-0c2c294a92e0" containerID="2fc65c6fba885caf6e5f3b70bea9168e80ade4978036246cfe5bc69f81bb5d39" exitCode=143 Mar 09 13:54:56 crc kubenswrapper[4703]: I0309 13:54:56.939355 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ckb7v/must-gather-ctzt7" Mar 09 13:54:56 crc kubenswrapper[4703]: I0309 13:54:56.939503 4703 scope.go:117] "RemoveContainer" containerID="2fc65c6fba885caf6e5f3b70bea9168e80ade4978036246cfe5bc69f81bb5d39" Mar 09 13:54:56 crc kubenswrapper[4703]: I0309 13:54:56.948626 4703 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3a63e24b-3a77-4f72-9a0d-0c2c294a92e0-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 09 13:54:56 crc kubenswrapper[4703]: I0309 13:54:56.948665 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhmjh\" (UniqueName: \"kubernetes.io/projected/3a63e24b-3a77-4f72-9a0d-0c2c294a92e0-kube-api-access-zhmjh\") on node \"crc\" DevicePath \"\"" Mar 09 13:54:56 crc kubenswrapper[4703]: I0309 13:54:56.969242 4703 scope.go:117] "RemoveContainer" containerID="ae2fd0c498b5a67061d9e9d8d092e0edf3e66b03c36f97154fb485f52649120f" Mar 09 13:54:57 crc kubenswrapper[4703]: I0309 13:54:57.022434 4703 scope.go:117] "RemoveContainer" containerID="2fc65c6fba885caf6e5f3b70bea9168e80ade4978036246cfe5bc69f81bb5d39" Mar 09 13:54:57 crc kubenswrapper[4703]: E0309 13:54:57.033916 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fc65c6fba885caf6e5f3b70bea9168e80ade4978036246cfe5bc69f81bb5d39\": container with ID starting with 2fc65c6fba885caf6e5f3b70bea9168e80ade4978036246cfe5bc69f81bb5d39 not found: ID does not exist" containerID="2fc65c6fba885caf6e5f3b70bea9168e80ade4978036246cfe5bc69f81bb5d39" Mar 09 13:54:57 crc kubenswrapper[4703]: I0309 13:54:57.033958 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fc65c6fba885caf6e5f3b70bea9168e80ade4978036246cfe5bc69f81bb5d39"} err="failed to get container status \"2fc65c6fba885caf6e5f3b70bea9168e80ade4978036246cfe5bc69f81bb5d39\": rpc error: code = 
NotFound desc = could not find container \"2fc65c6fba885caf6e5f3b70bea9168e80ade4978036246cfe5bc69f81bb5d39\": container with ID starting with 2fc65c6fba885caf6e5f3b70bea9168e80ade4978036246cfe5bc69f81bb5d39 not found: ID does not exist" Mar 09 13:54:57 crc kubenswrapper[4703]: I0309 13:54:57.033982 4703 scope.go:117] "RemoveContainer" containerID="ae2fd0c498b5a67061d9e9d8d092e0edf3e66b03c36f97154fb485f52649120f" Mar 09 13:54:57 crc kubenswrapper[4703]: E0309 13:54:57.034571 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae2fd0c498b5a67061d9e9d8d092e0edf3e66b03c36f97154fb485f52649120f\": container with ID starting with ae2fd0c498b5a67061d9e9d8d092e0edf3e66b03c36f97154fb485f52649120f not found: ID does not exist" containerID="ae2fd0c498b5a67061d9e9d8d092e0edf3e66b03c36f97154fb485f52649120f" Mar 09 13:54:57 crc kubenswrapper[4703]: I0309 13:54:57.034634 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae2fd0c498b5a67061d9e9d8d092e0edf3e66b03c36f97154fb485f52649120f"} err="failed to get container status \"ae2fd0c498b5a67061d9e9d8d092e0edf3e66b03c36f97154fb485f52649120f\": rpc error: code = NotFound desc = could not find container \"ae2fd0c498b5a67061d9e9d8d092e0edf3e66b03c36f97154fb485f52649120f\": container with ID starting with ae2fd0c498b5a67061d9e9d8d092e0edf3e66b03c36f97154fb485f52649120f not found: ID does not exist" Mar 09 13:54:58 crc kubenswrapper[4703]: I0309 13:54:58.719533 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a63e24b-3a77-4f72-9a0d-0c2c294a92e0" path="/var/lib/kubelet/pods/3a63e24b-3a77-4f72-9a0d-0c2c294a92e0/volumes" Mar 09 13:55:29 crc kubenswrapper[4703]: I0309 13:55:29.047399 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wzb9r"] Mar 09 13:55:29 crc kubenswrapper[4703]: E0309 13:55:29.048215 4703 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3a63e24b-3a77-4f72-9a0d-0c2c294a92e0" containerName="copy" Mar 09 13:55:29 crc kubenswrapper[4703]: I0309 13:55:29.048227 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a63e24b-3a77-4f72-9a0d-0c2c294a92e0" containerName="copy" Mar 09 13:55:29 crc kubenswrapper[4703]: E0309 13:55:29.048239 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a63e24b-3a77-4f72-9a0d-0c2c294a92e0" containerName="gather" Mar 09 13:55:29 crc kubenswrapper[4703]: I0309 13:55:29.048245 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a63e24b-3a77-4f72-9a0d-0c2c294a92e0" containerName="gather" Mar 09 13:55:29 crc kubenswrapper[4703]: E0309 13:55:29.048262 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b997e9e7-f5a8-4c95-a3a1-980dc816858e" containerName="oc" Mar 09 13:55:29 crc kubenswrapper[4703]: I0309 13:55:29.048271 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="b997e9e7-f5a8-4c95-a3a1-980dc816858e" containerName="oc" Mar 09 13:55:29 crc kubenswrapper[4703]: I0309 13:55:29.048385 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a63e24b-3a77-4f72-9a0d-0c2c294a92e0" containerName="gather" Mar 09 13:55:29 crc kubenswrapper[4703]: I0309 13:55:29.048400 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a63e24b-3a77-4f72-9a0d-0c2c294a92e0" containerName="copy" Mar 09 13:55:29 crc kubenswrapper[4703]: I0309 13:55:29.048414 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="b997e9e7-f5a8-4c95-a3a1-980dc816858e" containerName="oc" Mar 09 13:55:29 crc kubenswrapper[4703]: I0309 13:55:29.049211 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wzb9r" Mar 09 13:55:29 crc kubenswrapper[4703]: I0309 13:55:29.066635 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wzb9r"] Mar 09 13:55:29 crc kubenswrapper[4703]: I0309 13:55:29.210141 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9b3be23-e5c4-4883-b3b0-d44c72eaf6da-utilities\") pod \"certified-operators-wzb9r\" (UID: \"b9b3be23-e5c4-4883-b3b0-d44c72eaf6da\") " pod="openshift-marketplace/certified-operators-wzb9r" Mar 09 13:55:29 crc kubenswrapper[4703]: I0309 13:55:29.210184 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2prkf\" (UniqueName: \"kubernetes.io/projected/b9b3be23-e5c4-4883-b3b0-d44c72eaf6da-kube-api-access-2prkf\") pod \"certified-operators-wzb9r\" (UID: \"b9b3be23-e5c4-4883-b3b0-d44c72eaf6da\") " pod="openshift-marketplace/certified-operators-wzb9r" Mar 09 13:55:29 crc kubenswrapper[4703]: I0309 13:55:29.210228 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9b3be23-e5c4-4883-b3b0-d44c72eaf6da-catalog-content\") pod \"certified-operators-wzb9r\" (UID: \"b9b3be23-e5c4-4883-b3b0-d44c72eaf6da\") " pod="openshift-marketplace/certified-operators-wzb9r" Mar 09 13:55:29 crc kubenswrapper[4703]: I0309 13:55:29.311455 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9b3be23-e5c4-4883-b3b0-d44c72eaf6da-utilities\") pod \"certified-operators-wzb9r\" (UID: \"b9b3be23-e5c4-4883-b3b0-d44c72eaf6da\") " pod="openshift-marketplace/certified-operators-wzb9r" Mar 09 13:55:29 crc kubenswrapper[4703]: I0309 13:55:29.311509 4703 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2prkf\" (UniqueName: \"kubernetes.io/projected/b9b3be23-e5c4-4883-b3b0-d44c72eaf6da-kube-api-access-2prkf\") pod \"certified-operators-wzb9r\" (UID: \"b9b3be23-e5c4-4883-b3b0-d44c72eaf6da\") " pod="openshift-marketplace/certified-operators-wzb9r" Mar 09 13:55:29 crc kubenswrapper[4703]: I0309 13:55:29.311557 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9b3be23-e5c4-4883-b3b0-d44c72eaf6da-catalog-content\") pod \"certified-operators-wzb9r\" (UID: \"b9b3be23-e5c4-4883-b3b0-d44c72eaf6da\") " pod="openshift-marketplace/certified-operators-wzb9r" Mar 09 13:55:29 crc kubenswrapper[4703]: I0309 13:55:29.312099 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9b3be23-e5c4-4883-b3b0-d44c72eaf6da-utilities\") pod \"certified-operators-wzb9r\" (UID: \"b9b3be23-e5c4-4883-b3b0-d44c72eaf6da\") " pod="openshift-marketplace/certified-operators-wzb9r" Mar 09 13:55:29 crc kubenswrapper[4703]: I0309 13:55:29.312127 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9b3be23-e5c4-4883-b3b0-d44c72eaf6da-catalog-content\") pod \"certified-operators-wzb9r\" (UID: \"b9b3be23-e5c4-4883-b3b0-d44c72eaf6da\") " pod="openshift-marketplace/certified-operators-wzb9r" Mar 09 13:55:29 crc kubenswrapper[4703]: I0309 13:55:29.337270 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2prkf\" (UniqueName: \"kubernetes.io/projected/b9b3be23-e5c4-4883-b3b0-d44c72eaf6da-kube-api-access-2prkf\") pod \"certified-operators-wzb9r\" (UID: \"b9b3be23-e5c4-4883-b3b0-d44c72eaf6da\") " pod="openshift-marketplace/certified-operators-wzb9r" Mar 09 13:55:29 crc kubenswrapper[4703]: I0309 13:55:29.363766 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wzb9r" Mar 09 13:55:29 crc kubenswrapper[4703]: I0309 13:55:29.600048 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wzb9r"] Mar 09 13:55:30 crc kubenswrapper[4703]: I0309 13:55:30.174497 4703 generic.go:334] "Generic (PLEG): container finished" podID="b9b3be23-e5c4-4883-b3b0-d44c72eaf6da" containerID="2eacfe6f2d3118d0fb1ede63455fbafc8fac81a22d8fe665f17227e689334b7d" exitCode=0 Mar 09 13:55:30 crc kubenswrapper[4703]: I0309 13:55:30.174548 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzb9r" event={"ID":"b9b3be23-e5c4-4883-b3b0-d44c72eaf6da","Type":"ContainerDied","Data":"2eacfe6f2d3118d0fb1ede63455fbafc8fac81a22d8fe665f17227e689334b7d"} Mar 09 13:55:30 crc kubenswrapper[4703]: I0309 13:55:30.174569 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzb9r" event={"ID":"b9b3be23-e5c4-4883-b3b0-d44c72eaf6da","Type":"ContainerStarted","Data":"170c38ac6df7cd73abe8dfdf9f027fe4d2a4de25215dfbc3f6ae08d891de1a86"} Mar 09 13:55:32 crc kubenswrapper[4703]: I0309 13:55:32.193172 4703 generic.go:334] "Generic (PLEG): container finished" podID="b9b3be23-e5c4-4883-b3b0-d44c72eaf6da" containerID="924f4b4982a88a3580c76e05894b841689d1e3e7518ecb2854de0069bf4d493c" exitCode=0 Mar 09 13:55:32 crc kubenswrapper[4703]: I0309 13:55:32.193257 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzb9r" event={"ID":"b9b3be23-e5c4-4883-b3b0-d44c72eaf6da","Type":"ContainerDied","Data":"924f4b4982a88a3580c76e05894b841689d1e3e7518ecb2854de0069bf4d493c"} Mar 09 13:55:33 crc kubenswrapper[4703]: I0309 13:55:33.202553 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzb9r" 
event={"ID":"b9b3be23-e5c4-4883-b3b0-d44c72eaf6da","Type":"ContainerStarted","Data":"439a6b572966971f71979ffb2ad10e7586e668be6135891264dfe7d8ea1927d8"} Mar 09 13:55:33 crc kubenswrapper[4703]: I0309 13:55:33.231525 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wzb9r" podStartSLOduration=1.797349642 podStartE2EDuration="4.231493743s" podCreationTimestamp="2026-03-09 13:55:29 +0000 UTC" firstStartedPulling="2026-03-09 13:55:30.175708665 +0000 UTC m=+2126.143124351" lastFinishedPulling="2026-03-09 13:55:32.609852766 +0000 UTC m=+2128.577268452" observedRunningTime="2026-03-09 13:55:33.226694236 +0000 UTC m=+2129.194110022" watchObservedRunningTime="2026-03-09 13:55:33.231493743 +0000 UTC m=+2129.198909479" Mar 09 13:55:39 crc kubenswrapper[4703]: I0309 13:55:39.364335 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wzb9r" Mar 09 13:55:39 crc kubenswrapper[4703]: I0309 13:55:39.364925 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wzb9r" Mar 09 13:55:39 crc kubenswrapper[4703]: I0309 13:55:39.436394 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wzb9r" Mar 09 13:55:40 crc kubenswrapper[4703]: I0309 13:55:40.324401 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wzb9r" Mar 09 13:55:40 crc kubenswrapper[4703]: I0309 13:55:40.378354 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wzb9r"] Mar 09 13:55:42 crc kubenswrapper[4703]: I0309 13:55:42.266238 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wzb9r" podUID="b9b3be23-e5c4-4883-b3b0-d44c72eaf6da" containerName="registry-server" 
containerID="cri-o://439a6b572966971f71979ffb2ad10e7586e668be6135891264dfe7d8ea1927d8" gracePeriod=2 Mar 09 13:55:42 crc kubenswrapper[4703]: I0309 13:55:42.771270 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wzb9r" Mar 09 13:55:42 crc kubenswrapper[4703]: I0309 13:55:42.799813 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9b3be23-e5c4-4883-b3b0-d44c72eaf6da-utilities\") pod \"b9b3be23-e5c4-4883-b3b0-d44c72eaf6da\" (UID: \"b9b3be23-e5c4-4883-b3b0-d44c72eaf6da\") " Mar 09 13:55:42 crc kubenswrapper[4703]: I0309 13:55:42.800102 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9b3be23-e5c4-4883-b3b0-d44c72eaf6da-catalog-content\") pod \"b9b3be23-e5c4-4883-b3b0-d44c72eaf6da\" (UID: \"b9b3be23-e5c4-4883-b3b0-d44c72eaf6da\") " Mar 09 13:55:42 crc kubenswrapper[4703]: I0309 13:55:42.802258 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9b3be23-e5c4-4883-b3b0-d44c72eaf6da-utilities" (OuterVolumeSpecName: "utilities") pod "b9b3be23-e5c4-4883-b3b0-d44c72eaf6da" (UID: "b9b3be23-e5c4-4883-b3b0-d44c72eaf6da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:55:42 crc kubenswrapper[4703]: I0309 13:55:42.874593 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9b3be23-e5c4-4883-b3b0-d44c72eaf6da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9b3be23-e5c4-4883-b3b0-d44c72eaf6da" (UID: "b9b3be23-e5c4-4883-b3b0-d44c72eaf6da"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:55:42 crc kubenswrapper[4703]: I0309 13:55:42.901272 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2prkf\" (UniqueName: \"kubernetes.io/projected/b9b3be23-e5c4-4883-b3b0-d44c72eaf6da-kube-api-access-2prkf\") pod \"b9b3be23-e5c4-4883-b3b0-d44c72eaf6da\" (UID: \"b9b3be23-e5c4-4883-b3b0-d44c72eaf6da\") " Mar 09 13:55:42 crc kubenswrapper[4703]: I0309 13:55:42.901438 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9b3be23-e5c4-4883-b3b0-d44c72eaf6da-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:55:42 crc kubenswrapper[4703]: I0309 13:55:42.901450 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9b3be23-e5c4-4883-b3b0-d44c72eaf6da-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:55:42 crc kubenswrapper[4703]: I0309 13:55:42.907734 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9b3be23-e5c4-4883-b3b0-d44c72eaf6da-kube-api-access-2prkf" (OuterVolumeSpecName: "kube-api-access-2prkf") pod "b9b3be23-e5c4-4883-b3b0-d44c72eaf6da" (UID: "b9b3be23-e5c4-4883-b3b0-d44c72eaf6da"). InnerVolumeSpecName "kube-api-access-2prkf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:55:43 crc kubenswrapper[4703]: I0309 13:55:43.003063 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2prkf\" (UniqueName: \"kubernetes.io/projected/b9b3be23-e5c4-4883-b3b0-d44c72eaf6da-kube-api-access-2prkf\") on node \"crc\" DevicePath \"\"" Mar 09 13:55:43 crc kubenswrapper[4703]: I0309 13:55:43.277356 4703 generic.go:334] "Generic (PLEG): container finished" podID="b9b3be23-e5c4-4883-b3b0-d44c72eaf6da" containerID="439a6b572966971f71979ffb2ad10e7586e668be6135891264dfe7d8ea1927d8" exitCode=0 Mar 09 13:55:43 crc kubenswrapper[4703]: I0309 13:55:43.277402 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzb9r" event={"ID":"b9b3be23-e5c4-4883-b3b0-d44c72eaf6da","Type":"ContainerDied","Data":"439a6b572966971f71979ffb2ad10e7586e668be6135891264dfe7d8ea1927d8"} Mar 09 13:55:43 crc kubenswrapper[4703]: I0309 13:55:43.277428 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzb9r" event={"ID":"b9b3be23-e5c4-4883-b3b0-d44c72eaf6da","Type":"ContainerDied","Data":"170c38ac6df7cd73abe8dfdf9f027fe4d2a4de25215dfbc3f6ae08d891de1a86"} Mar 09 13:55:43 crc kubenswrapper[4703]: I0309 13:55:43.277447 4703 scope.go:117] "RemoveContainer" containerID="439a6b572966971f71979ffb2ad10e7586e668be6135891264dfe7d8ea1927d8" Mar 09 13:55:43 crc kubenswrapper[4703]: I0309 13:55:43.277513 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wzb9r" Mar 09 13:55:43 crc kubenswrapper[4703]: I0309 13:55:43.306507 4703 scope.go:117] "RemoveContainer" containerID="924f4b4982a88a3580c76e05894b841689d1e3e7518ecb2854de0069bf4d493c" Mar 09 13:55:43 crc kubenswrapper[4703]: I0309 13:55:43.328417 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wzb9r"] Mar 09 13:55:43 crc kubenswrapper[4703]: I0309 13:55:43.335934 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wzb9r"] Mar 09 13:55:43 crc kubenswrapper[4703]: I0309 13:55:43.344169 4703 scope.go:117] "RemoveContainer" containerID="2eacfe6f2d3118d0fb1ede63455fbafc8fac81a22d8fe665f17227e689334b7d" Mar 09 13:55:43 crc kubenswrapper[4703]: I0309 13:55:43.378605 4703 scope.go:117] "RemoveContainer" containerID="439a6b572966971f71979ffb2ad10e7586e668be6135891264dfe7d8ea1927d8" Mar 09 13:55:43 crc kubenswrapper[4703]: E0309 13:55:43.379386 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"439a6b572966971f71979ffb2ad10e7586e668be6135891264dfe7d8ea1927d8\": container with ID starting with 439a6b572966971f71979ffb2ad10e7586e668be6135891264dfe7d8ea1927d8 not found: ID does not exist" containerID="439a6b572966971f71979ffb2ad10e7586e668be6135891264dfe7d8ea1927d8" Mar 09 13:55:43 crc kubenswrapper[4703]: I0309 13:55:43.379443 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"439a6b572966971f71979ffb2ad10e7586e668be6135891264dfe7d8ea1927d8"} err="failed to get container status \"439a6b572966971f71979ffb2ad10e7586e668be6135891264dfe7d8ea1927d8\": rpc error: code = NotFound desc = could not find container \"439a6b572966971f71979ffb2ad10e7586e668be6135891264dfe7d8ea1927d8\": container with ID starting with 439a6b572966971f71979ffb2ad10e7586e668be6135891264dfe7d8ea1927d8 not 
found: ID does not exist" Mar 09 13:55:43 crc kubenswrapper[4703]: I0309 13:55:43.379477 4703 scope.go:117] "RemoveContainer" containerID="924f4b4982a88a3580c76e05894b841689d1e3e7518ecb2854de0069bf4d493c" Mar 09 13:55:43 crc kubenswrapper[4703]: E0309 13:55:43.380718 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"924f4b4982a88a3580c76e05894b841689d1e3e7518ecb2854de0069bf4d493c\": container with ID starting with 924f4b4982a88a3580c76e05894b841689d1e3e7518ecb2854de0069bf4d493c not found: ID does not exist" containerID="924f4b4982a88a3580c76e05894b841689d1e3e7518ecb2854de0069bf4d493c" Mar 09 13:55:43 crc kubenswrapper[4703]: I0309 13:55:43.380745 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"924f4b4982a88a3580c76e05894b841689d1e3e7518ecb2854de0069bf4d493c"} err="failed to get container status \"924f4b4982a88a3580c76e05894b841689d1e3e7518ecb2854de0069bf4d493c\": rpc error: code = NotFound desc = could not find container \"924f4b4982a88a3580c76e05894b841689d1e3e7518ecb2854de0069bf4d493c\": container with ID starting with 924f4b4982a88a3580c76e05894b841689d1e3e7518ecb2854de0069bf4d493c not found: ID does not exist" Mar 09 13:55:43 crc kubenswrapper[4703]: I0309 13:55:43.380759 4703 scope.go:117] "RemoveContainer" containerID="2eacfe6f2d3118d0fb1ede63455fbafc8fac81a22d8fe665f17227e689334b7d" Mar 09 13:55:43 crc kubenswrapper[4703]: E0309 13:55:43.381495 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eacfe6f2d3118d0fb1ede63455fbafc8fac81a22d8fe665f17227e689334b7d\": container with ID starting with 2eacfe6f2d3118d0fb1ede63455fbafc8fac81a22d8fe665f17227e689334b7d not found: ID does not exist" containerID="2eacfe6f2d3118d0fb1ede63455fbafc8fac81a22d8fe665f17227e689334b7d" Mar 09 13:55:43 crc kubenswrapper[4703]: I0309 13:55:43.381531 4703 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eacfe6f2d3118d0fb1ede63455fbafc8fac81a22d8fe665f17227e689334b7d"} err="failed to get container status \"2eacfe6f2d3118d0fb1ede63455fbafc8fac81a22d8fe665f17227e689334b7d\": rpc error: code = NotFound desc = could not find container \"2eacfe6f2d3118d0fb1ede63455fbafc8fac81a22d8fe665f17227e689334b7d\": container with ID starting with 2eacfe6f2d3118d0fb1ede63455fbafc8fac81a22d8fe665f17227e689334b7d not found: ID does not exist" Mar 09 13:55:44 crc kubenswrapper[4703]: I0309 13:55:44.719800 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9b3be23-e5c4-4883-b3b0-d44c72eaf6da" path="/var/lib/kubelet/pods/b9b3be23-e5c4-4883-b3b0-d44c72eaf6da/volumes" Mar 09 13:56:00 crc kubenswrapper[4703]: I0309 13:56:00.154305 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551076-cm7bs"] Mar 09 13:56:00 crc kubenswrapper[4703]: E0309 13:56:00.155208 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9b3be23-e5c4-4883-b3b0-d44c72eaf6da" containerName="extract-content" Mar 09 13:56:00 crc kubenswrapper[4703]: I0309 13:56:00.155226 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9b3be23-e5c4-4883-b3b0-d44c72eaf6da" containerName="extract-content" Mar 09 13:56:00 crc kubenswrapper[4703]: E0309 13:56:00.155243 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9b3be23-e5c4-4883-b3b0-d44c72eaf6da" containerName="extract-utilities" Mar 09 13:56:00 crc kubenswrapper[4703]: I0309 13:56:00.155251 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9b3be23-e5c4-4883-b3b0-d44c72eaf6da" containerName="extract-utilities" Mar 09 13:56:00 crc kubenswrapper[4703]: E0309 13:56:00.155373 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9b3be23-e5c4-4883-b3b0-d44c72eaf6da" containerName="registry-server" Mar 09 13:56:00 crc kubenswrapper[4703]: I0309 
13:56:00.155385 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9b3be23-e5c4-4883-b3b0-d44c72eaf6da" containerName="registry-server" Mar 09 13:56:00 crc kubenswrapper[4703]: I0309 13:56:00.155576 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9b3be23-e5c4-4883-b3b0-d44c72eaf6da" containerName="registry-server" Mar 09 13:56:00 crc kubenswrapper[4703]: I0309 13:56:00.156165 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551076-cm7bs" Mar 09 13:56:00 crc kubenswrapper[4703]: I0309 13:56:00.159268 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 13:56:00 crc kubenswrapper[4703]: I0309 13:56:00.160625 4703 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 13:56:00 crc kubenswrapper[4703]: I0309 13:56:00.161262 4703 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rkrjn" Mar 09 13:56:00 crc kubenswrapper[4703]: I0309 13:56:00.169898 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551076-cm7bs"] Mar 09 13:56:00 crc kubenswrapper[4703]: I0309 13:56:00.256230 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77dpc\" (UniqueName: \"kubernetes.io/projected/b021ac30-1ddb-4c74-afba-9d72d663eafc-kube-api-access-77dpc\") pod \"auto-csr-approver-29551076-cm7bs\" (UID: \"b021ac30-1ddb-4c74-afba-9d72d663eafc\") " pod="openshift-infra/auto-csr-approver-29551076-cm7bs" Mar 09 13:56:00 crc kubenswrapper[4703]: I0309 13:56:00.358667 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77dpc\" (UniqueName: \"kubernetes.io/projected/b021ac30-1ddb-4c74-afba-9d72d663eafc-kube-api-access-77dpc\") pod \"auto-csr-approver-29551076-cm7bs\" (UID: 
\"b021ac30-1ddb-4c74-afba-9d72d663eafc\") " pod="openshift-infra/auto-csr-approver-29551076-cm7bs" Mar 09 13:56:00 crc kubenswrapper[4703]: I0309 13:56:00.393101 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77dpc\" (UniqueName: \"kubernetes.io/projected/b021ac30-1ddb-4c74-afba-9d72d663eafc-kube-api-access-77dpc\") pod \"auto-csr-approver-29551076-cm7bs\" (UID: \"b021ac30-1ddb-4c74-afba-9d72d663eafc\") " pod="openshift-infra/auto-csr-approver-29551076-cm7bs" Mar 09 13:56:00 crc kubenswrapper[4703]: I0309 13:56:00.478068 4703 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551076-cm7bs" Mar 09 13:56:00 crc kubenswrapper[4703]: I0309 13:56:00.717822 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551076-cm7bs"] Mar 09 13:56:00 crc kubenswrapper[4703]: I0309 13:56:00.720965 4703 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 13:56:01 crc kubenswrapper[4703]: I0309 13:56:01.416672 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551076-cm7bs" event={"ID":"b021ac30-1ddb-4c74-afba-9d72d663eafc","Type":"ContainerStarted","Data":"1e253bddacf8a480b8e76beb64ffa15c4740856b7e53da37de4e5598da83df3e"} Mar 09 13:56:02 crc kubenswrapper[4703]: I0309 13:56:02.424734 4703 generic.go:334] "Generic (PLEG): container finished" podID="b021ac30-1ddb-4c74-afba-9d72d663eafc" containerID="4349f83f185735c6034c1da2211bd73564834ca85bb358ccd6359cf77a430691" exitCode=0 Mar 09 13:56:02 crc kubenswrapper[4703]: I0309 13:56:02.424802 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551076-cm7bs" event={"ID":"b021ac30-1ddb-4c74-afba-9d72d663eafc","Type":"ContainerDied","Data":"4349f83f185735c6034c1da2211bd73564834ca85bb358ccd6359cf77a430691"} Mar 09 13:56:03 crc kubenswrapper[4703]: I0309 
13:56:03.707343 4703 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551076-cm7bs" Mar 09 13:56:03 crc kubenswrapper[4703]: I0309 13:56:03.806424 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77dpc\" (UniqueName: \"kubernetes.io/projected/b021ac30-1ddb-4c74-afba-9d72d663eafc-kube-api-access-77dpc\") pod \"b021ac30-1ddb-4c74-afba-9d72d663eafc\" (UID: \"b021ac30-1ddb-4c74-afba-9d72d663eafc\") " Mar 09 13:56:03 crc kubenswrapper[4703]: I0309 13:56:03.813141 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b021ac30-1ddb-4c74-afba-9d72d663eafc-kube-api-access-77dpc" (OuterVolumeSpecName: "kube-api-access-77dpc") pod "b021ac30-1ddb-4c74-afba-9d72d663eafc" (UID: "b021ac30-1ddb-4c74-afba-9d72d663eafc"). InnerVolumeSpecName "kube-api-access-77dpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:56:03 crc kubenswrapper[4703]: I0309 13:56:03.909824 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77dpc\" (UniqueName: \"kubernetes.io/projected/b021ac30-1ddb-4c74-afba-9d72d663eafc-kube-api-access-77dpc\") on node \"crc\" DevicePath \"\"" Mar 09 13:56:04 crc kubenswrapper[4703]: I0309 13:56:04.446012 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551076-cm7bs" event={"ID":"b021ac30-1ddb-4c74-afba-9d72d663eafc","Type":"ContainerDied","Data":"1e253bddacf8a480b8e76beb64ffa15c4740856b7e53da37de4e5598da83df3e"} Mar 09 13:56:04 crc kubenswrapper[4703]: I0309 13:56:04.446076 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551076-cm7bs" Mar 09 13:56:04 crc kubenswrapper[4703]: I0309 13:56:04.446100 4703 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e253bddacf8a480b8e76beb64ffa15c4740856b7e53da37de4e5598da83df3e" Mar 09 13:56:04 crc kubenswrapper[4703]: I0309 13:56:04.781995 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551070-f22kz"] Mar 09 13:56:04 crc kubenswrapper[4703]: I0309 13:56:04.787470 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551070-f22kz"] Mar 09 13:56:06 crc kubenswrapper[4703]: I0309 13:56:06.720398 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffa8e1b1-3b52-427a-ba01-9741dc577c9b" path="/var/lib/kubelet/pods/ffa8e1b1-3b52-427a-ba01-9741dc577c9b/volumes" Mar 09 13:56:29 crc kubenswrapper[4703]: I0309 13:56:29.238995 4703 scope.go:117] "RemoveContainer" containerID="f4ec704096dd91dd49b61e9a9706ffcf474614bd7edfe4c98dd21472f5357370" Mar 09 13:56:39 crc kubenswrapper[4703]: I0309 13:56:39.075187 4703 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-555sh"] Mar 09 13:56:39 crc kubenswrapper[4703]: E0309 13:56:39.083405 4703 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b021ac30-1ddb-4c74-afba-9d72d663eafc" containerName="oc" Mar 09 13:56:39 crc kubenswrapper[4703]: I0309 13:56:39.083493 4703 state_mem.go:107] "Deleted CPUSet assignment" podUID="b021ac30-1ddb-4c74-afba-9d72d663eafc" containerName="oc" Mar 09 13:56:39 crc kubenswrapper[4703]: I0309 13:56:39.085091 4703 memory_manager.go:354] "RemoveStaleState removing state" podUID="b021ac30-1ddb-4c74-afba-9d72d663eafc" containerName="oc" Mar 09 13:56:39 crc kubenswrapper[4703]: I0309 13:56:39.086910 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-555sh" Mar 09 13:56:39 crc kubenswrapper[4703]: I0309 13:56:39.098043 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-555sh"] Mar 09 13:56:39 crc kubenswrapper[4703]: I0309 13:56:39.203378 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f8f33a0-8dcd-4081-904d-cafd0874e070-catalog-content\") pod \"redhat-operators-555sh\" (UID: \"0f8f33a0-8dcd-4081-904d-cafd0874e070\") " pod="openshift-marketplace/redhat-operators-555sh" Mar 09 13:56:39 crc kubenswrapper[4703]: I0309 13:56:39.204047 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f8f33a0-8dcd-4081-904d-cafd0874e070-utilities\") pod \"redhat-operators-555sh\" (UID: \"0f8f33a0-8dcd-4081-904d-cafd0874e070\") " pod="openshift-marketplace/redhat-operators-555sh" Mar 09 13:56:39 crc kubenswrapper[4703]: I0309 13:56:39.204330 4703 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tblqn\" (UniqueName: \"kubernetes.io/projected/0f8f33a0-8dcd-4081-904d-cafd0874e070-kube-api-access-tblqn\") pod \"redhat-operators-555sh\" (UID: \"0f8f33a0-8dcd-4081-904d-cafd0874e070\") " pod="openshift-marketplace/redhat-operators-555sh" Mar 09 13:56:39 crc kubenswrapper[4703]: I0309 13:56:39.305894 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tblqn\" (UniqueName: \"kubernetes.io/projected/0f8f33a0-8dcd-4081-904d-cafd0874e070-kube-api-access-tblqn\") pod \"redhat-operators-555sh\" (UID: \"0f8f33a0-8dcd-4081-904d-cafd0874e070\") " pod="openshift-marketplace/redhat-operators-555sh" Mar 09 13:56:39 crc kubenswrapper[4703]: I0309 13:56:39.306044 4703 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f8f33a0-8dcd-4081-904d-cafd0874e070-catalog-content\") pod \"redhat-operators-555sh\" (UID: \"0f8f33a0-8dcd-4081-904d-cafd0874e070\") " pod="openshift-marketplace/redhat-operators-555sh" Mar 09 13:56:39 crc kubenswrapper[4703]: I0309 13:56:39.306128 4703 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f8f33a0-8dcd-4081-904d-cafd0874e070-utilities\") pod \"redhat-operators-555sh\" (UID: \"0f8f33a0-8dcd-4081-904d-cafd0874e070\") " pod="openshift-marketplace/redhat-operators-555sh" Mar 09 13:56:39 crc kubenswrapper[4703]: I0309 13:56:39.306877 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f8f33a0-8dcd-4081-904d-cafd0874e070-utilities\") pod \"redhat-operators-555sh\" (UID: \"0f8f33a0-8dcd-4081-904d-cafd0874e070\") " pod="openshift-marketplace/redhat-operators-555sh" Mar 09 13:56:39 crc kubenswrapper[4703]: I0309 13:56:39.307125 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f8f33a0-8dcd-4081-904d-cafd0874e070-catalog-content\") pod \"redhat-operators-555sh\" (UID: \"0f8f33a0-8dcd-4081-904d-cafd0874e070\") " pod="openshift-marketplace/redhat-operators-555sh" Mar 09 13:56:39 crc kubenswrapper[4703]: I0309 13:56:39.332737 4703 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tblqn\" (UniqueName: \"kubernetes.io/projected/0f8f33a0-8dcd-4081-904d-cafd0874e070-kube-api-access-tblqn\") pod \"redhat-operators-555sh\" (UID: \"0f8f33a0-8dcd-4081-904d-cafd0874e070\") " pod="openshift-marketplace/redhat-operators-555sh" Mar 09 13:56:39 crc kubenswrapper[4703]: I0309 13:56:39.419248 4703 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-555sh" Mar 09 13:56:39 crc kubenswrapper[4703]: I0309 13:56:39.500462 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:56:39 crc kubenswrapper[4703]: I0309 13:56:39.500539 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 13:56:39 crc kubenswrapper[4703]: I0309 13:56:39.844734 4703 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-555sh"] Mar 09 13:56:40 crc kubenswrapper[4703]: I0309 13:56:40.108377 4703 generic.go:334] "Generic (PLEG): container finished" podID="0f8f33a0-8dcd-4081-904d-cafd0874e070" containerID="3f579e32f91b269db039e4e490669259037443e994eac9ac39a8dcb8d395729d" exitCode=0 Mar 09 13:56:40 crc kubenswrapper[4703]: I0309 13:56:40.108477 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-555sh" event={"ID":"0f8f33a0-8dcd-4081-904d-cafd0874e070","Type":"ContainerDied","Data":"3f579e32f91b269db039e4e490669259037443e994eac9ac39a8dcb8d395729d"} Mar 09 13:56:40 crc kubenswrapper[4703]: I0309 13:56:40.108601 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-555sh" event={"ID":"0f8f33a0-8dcd-4081-904d-cafd0874e070","Type":"ContainerStarted","Data":"e752ad5b6927c172f0dcb22a622959dbf37bf87af7f50d75c3adcae372526760"} Mar 09 13:56:42 crc kubenswrapper[4703]: I0309 13:56:42.133961 4703 generic.go:334] "Generic (PLEG): container 
finished" podID="0f8f33a0-8dcd-4081-904d-cafd0874e070" containerID="ef9d62b034a740598ddfae721fa6bacecd32959944a40b0db5d4bdb9f687ae98" exitCode=0 Mar 09 13:56:42 crc kubenswrapper[4703]: I0309 13:56:42.134064 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-555sh" event={"ID":"0f8f33a0-8dcd-4081-904d-cafd0874e070","Type":"ContainerDied","Data":"ef9d62b034a740598ddfae721fa6bacecd32959944a40b0db5d4bdb9f687ae98"} Mar 09 13:56:43 crc kubenswrapper[4703]: I0309 13:56:43.146992 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-555sh" event={"ID":"0f8f33a0-8dcd-4081-904d-cafd0874e070","Type":"ContainerStarted","Data":"255848d3631f70f3c3a049f30e6a2a9c25261cb1120015f592056bcca7dffb8c"} Mar 09 13:56:43 crc kubenswrapper[4703]: I0309 13:56:43.179700 4703 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-555sh" podStartSLOduration=1.705992117 podStartE2EDuration="4.179674377s" podCreationTimestamp="2026-03-09 13:56:39 +0000 UTC" firstStartedPulling="2026-03-09 13:56:40.111905065 +0000 UTC m=+2196.079320761" lastFinishedPulling="2026-03-09 13:56:42.585587325 +0000 UTC m=+2198.553003021" observedRunningTime="2026-03-09 13:56:43.176938359 +0000 UTC m=+2199.144354105" watchObservedRunningTime="2026-03-09 13:56:43.179674377 +0000 UTC m=+2199.147090093" Mar 09 13:56:49 crc kubenswrapper[4703]: I0309 13:56:49.419517 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-555sh" Mar 09 13:56:49 crc kubenswrapper[4703]: I0309 13:56:49.420331 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-555sh" Mar 09 13:56:50 crc kubenswrapper[4703]: I0309 13:56:50.473189 4703 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-555sh" 
podUID="0f8f33a0-8dcd-4081-904d-cafd0874e070" containerName="registry-server" probeResult="failure" output=< Mar 09 13:56:50 crc kubenswrapper[4703]: timeout: failed to connect service ":50051" within 1s Mar 09 13:56:50 crc kubenswrapper[4703]: > Mar 09 13:56:59 crc kubenswrapper[4703]: I0309 13:56:59.469537 4703 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-555sh" Mar 09 13:56:59 crc kubenswrapper[4703]: I0309 13:56:59.545911 4703 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-555sh" Mar 09 13:56:59 crc kubenswrapper[4703]: I0309 13:56:59.716721 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-555sh"] Mar 09 13:57:01 crc kubenswrapper[4703]: I0309 13:57:01.274533 4703 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-555sh" podUID="0f8f33a0-8dcd-4081-904d-cafd0874e070" containerName="registry-server" containerID="cri-o://255848d3631f70f3c3a049f30e6a2a9c25261cb1120015f592056bcca7dffb8c" gracePeriod=2 Mar 09 13:57:01 crc kubenswrapper[4703]: I0309 13:57:01.743155 4703 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-555sh" Mar 09 13:57:01 crc kubenswrapper[4703]: I0309 13:57:01.903027 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f8f33a0-8dcd-4081-904d-cafd0874e070-catalog-content\") pod \"0f8f33a0-8dcd-4081-904d-cafd0874e070\" (UID: \"0f8f33a0-8dcd-4081-904d-cafd0874e070\") " Mar 09 13:57:01 crc kubenswrapper[4703]: I0309 13:57:01.903161 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f8f33a0-8dcd-4081-904d-cafd0874e070-utilities\") pod \"0f8f33a0-8dcd-4081-904d-cafd0874e070\" (UID: \"0f8f33a0-8dcd-4081-904d-cafd0874e070\") " Mar 09 13:57:01 crc kubenswrapper[4703]: I0309 13:57:01.903267 4703 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tblqn\" (UniqueName: \"kubernetes.io/projected/0f8f33a0-8dcd-4081-904d-cafd0874e070-kube-api-access-tblqn\") pod \"0f8f33a0-8dcd-4081-904d-cafd0874e070\" (UID: \"0f8f33a0-8dcd-4081-904d-cafd0874e070\") " Mar 09 13:57:01 crc kubenswrapper[4703]: I0309 13:57:01.903940 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f8f33a0-8dcd-4081-904d-cafd0874e070-utilities" (OuterVolumeSpecName: "utilities") pod "0f8f33a0-8dcd-4081-904d-cafd0874e070" (UID: "0f8f33a0-8dcd-4081-904d-cafd0874e070"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:57:01 crc kubenswrapper[4703]: I0309 13:57:01.910042 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f8f33a0-8dcd-4081-904d-cafd0874e070-kube-api-access-tblqn" (OuterVolumeSpecName: "kube-api-access-tblqn") pod "0f8f33a0-8dcd-4081-904d-cafd0874e070" (UID: "0f8f33a0-8dcd-4081-904d-cafd0874e070"). InnerVolumeSpecName "kube-api-access-tblqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 13:57:02 crc kubenswrapper[4703]: I0309 13:57:02.004839 4703 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tblqn\" (UniqueName: \"kubernetes.io/projected/0f8f33a0-8dcd-4081-904d-cafd0874e070-kube-api-access-tblqn\") on node \"crc\" DevicePath \"\"" Mar 09 13:57:02 crc kubenswrapper[4703]: I0309 13:57:02.004903 4703 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f8f33a0-8dcd-4081-904d-cafd0874e070-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 13:57:02 crc kubenswrapper[4703]: I0309 13:57:02.072402 4703 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f8f33a0-8dcd-4081-904d-cafd0874e070-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f8f33a0-8dcd-4081-904d-cafd0874e070" (UID: "0f8f33a0-8dcd-4081-904d-cafd0874e070"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 13:57:02 crc kubenswrapper[4703]: I0309 13:57:02.107680 4703 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f8f33a0-8dcd-4081-904d-cafd0874e070-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 13:57:02 crc kubenswrapper[4703]: I0309 13:57:02.287253 4703 generic.go:334] "Generic (PLEG): container finished" podID="0f8f33a0-8dcd-4081-904d-cafd0874e070" containerID="255848d3631f70f3c3a049f30e6a2a9c25261cb1120015f592056bcca7dffb8c" exitCode=0 Mar 09 13:57:02 crc kubenswrapper[4703]: I0309 13:57:02.287314 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-555sh" event={"ID":"0f8f33a0-8dcd-4081-904d-cafd0874e070","Type":"ContainerDied","Data":"255848d3631f70f3c3a049f30e6a2a9c25261cb1120015f592056bcca7dffb8c"} Mar 09 13:57:02 crc kubenswrapper[4703]: I0309 13:57:02.287325 4703 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-555sh" Mar 09 13:57:02 crc kubenswrapper[4703]: I0309 13:57:02.287354 4703 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-555sh" event={"ID":"0f8f33a0-8dcd-4081-904d-cafd0874e070","Type":"ContainerDied","Data":"e752ad5b6927c172f0dcb22a622959dbf37bf87af7f50d75c3adcae372526760"} Mar 09 13:57:02 crc kubenswrapper[4703]: I0309 13:57:02.287382 4703 scope.go:117] "RemoveContainer" containerID="255848d3631f70f3c3a049f30e6a2a9c25261cb1120015f592056bcca7dffb8c" Mar 09 13:57:02 crc kubenswrapper[4703]: I0309 13:57:02.313090 4703 scope.go:117] "RemoveContainer" containerID="ef9d62b034a740598ddfae721fa6bacecd32959944a40b0db5d4bdb9f687ae98" Mar 09 13:57:02 crc kubenswrapper[4703]: I0309 13:57:02.339353 4703 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-555sh"] Mar 09 13:57:02 crc kubenswrapper[4703]: I0309 13:57:02.349064 4703 scope.go:117] "RemoveContainer" containerID="3f579e32f91b269db039e4e490669259037443e994eac9ac39a8dcb8d395729d" Mar 09 13:57:02 crc kubenswrapper[4703]: I0309 13:57:02.355168 4703 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-555sh"] Mar 09 13:57:02 crc kubenswrapper[4703]: I0309 13:57:02.390958 4703 scope.go:117] "RemoveContainer" containerID="255848d3631f70f3c3a049f30e6a2a9c25261cb1120015f592056bcca7dffb8c" Mar 09 13:57:02 crc kubenswrapper[4703]: E0309 13:57:02.391599 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"255848d3631f70f3c3a049f30e6a2a9c25261cb1120015f592056bcca7dffb8c\": container with ID starting with 255848d3631f70f3c3a049f30e6a2a9c25261cb1120015f592056bcca7dffb8c not found: ID does not exist" containerID="255848d3631f70f3c3a049f30e6a2a9c25261cb1120015f592056bcca7dffb8c" Mar 09 13:57:02 crc kubenswrapper[4703]: I0309 13:57:02.391658 4703 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"255848d3631f70f3c3a049f30e6a2a9c25261cb1120015f592056bcca7dffb8c"} err="failed to get container status \"255848d3631f70f3c3a049f30e6a2a9c25261cb1120015f592056bcca7dffb8c\": rpc error: code = NotFound desc = could not find container \"255848d3631f70f3c3a049f30e6a2a9c25261cb1120015f592056bcca7dffb8c\": container with ID starting with 255848d3631f70f3c3a049f30e6a2a9c25261cb1120015f592056bcca7dffb8c not found: ID does not exist" Mar 09 13:57:02 crc kubenswrapper[4703]: I0309 13:57:02.391696 4703 scope.go:117] "RemoveContainer" containerID="ef9d62b034a740598ddfae721fa6bacecd32959944a40b0db5d4bdb9f687ae98" Mar 09 13:57:02 crc kubenswrapper[4703]: E0309 13:57:02.392215 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef9d62b034a740598ddfae721fa6bacecd32959944a40b0db5d4bdb9f687ae98\": container with ID starting with ef9d62b034a740598ddfae721fa6bacecd32959944a40b0db5d4bdb9f687ae98 not found: ID does not exist" containerID="ef9d62b034a740598ddfae721fa6bacecd32959944a40b0db5d4bdb9f687ae98" Mar 09 13:57:02 crc kubenswrapper[4703]: I0309 13:57:02.392297 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef9d62b034a740598ddfae721fa6bacecd32959944a40b0db5d4bdb9f687ae98"} err="failed to get container status \"ef9d62b034a740598ddfae721fa6bacecd32959944a40b0db5d4bdb9f687ae98\": rpc error: code = NotFound desc = could not find container \"ef9d62b034a740598ddfae721fa6bacecd32959944a40b0db5d4bdb9f687ae98\": container with ID starting with ef9d62b034a740598ddfae721fa6bacecd32959944a40b0db5d4bdb9f687ae98 not found: ID does not exist" Mar 09 13:57:02 crc kubenswrapper[4703]: I0309 13:57:02.392324 4703 scope.go:117] "RemoveContainer" containerID="3f579e32f91b269db039e4e490669259037443e994eac9ac39a8dcb8d395729d" Mar 09 13:57:02 crc kubenswrapper[4703]: E0309 
13:57:02.392728 4703 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f579e32f91b269db039e4e490669259037443e994eac9ac39a8dcb8d395729d\": container with ID starting with 3f579e32f91b269db039e4e490669259037443e994eac9ac39a8dcb8d395729d not found: ID does not exist" containerID="3f579e32f91b269db039e4e490669259037443e994eac9ac39a8dcb8d395729d" Mar 09 13:57:02 crc kubenswrapper[4703]: I0309 13:57:02.392814 4703 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f579e32f91b269db039e4e490669259037443e994eac9ac39a8dcb8d395729d"} err="failed to get container status \"3f579e32f91b269db039e4e490669259037443e994eac9ac39a8dcb8d395729d\": rpc error: code = NotFound desc = could not find container \"3f579e32f91b269db039e4e490669259037443e994eac9ac39a8dcb8d395729d\": container with ID starting with 3f579e32f91b269db039e4e490669259037443e994eac9ac39a8dcb8d395729d not found: ID does not exist" Mar 09 13:57:02 crc kubenswrapper[4703]: I0309 13:57:02.724799 4703 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f8f33a0-8dcd-4081-904d-cafd0874e070" path="/var/lib/kubelet/pods/0f8f33a0-8dcd-4081-904d-cafd0874e070/volumes" Mar 09 13:57:09 crc kubenswrapper[4703]: I0309 13:57:09.500772 4703 patch_prober.go:28] interesting pod/machine-config-daemon-pmzvj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 13:57:09 crc kubenswrapper[4703]: I0309 13:57:09.501532 4703 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-pmzvj" podUID="4316a119-ceb8-44c1-a4ad-2d64ca0c0f29" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused"